My name is Jeremy Keith. I make websites at Clearleft, the design agency I co‐founded in 2005. My online home is adactio.com where I’ve been jotting down my thoughts for over fifteen years.
The idea of shearing layers can also be applied to our creations on the web. Our domain names are the geological sites upon which we build. At the other end of the timescale, content on the web—the “stuff”—can be added and updated by the hour, the minute, or even the second. In between are the layers of structure, presentation, and behaviour: HTML, CSS, and JavaScript.
Those layers can be loosely‐coupled, but they aren’t completely independent. Just as a building cannot have furniture without first having rooms and walls, a style sheet needs some markup to act upon. The coupling between structure and presentation is handled through selectors in CSS: element selectors, class selectors, and so on. With JavaScript, the coupling is handled through the vocabulary of the Document Object Model, or DOM.
In a later book, The Clock Of The Long Now, Stewart Brand applied the idea of shearing layers—or pace layers—to civilisation itself. The slowest moving layer is nature, then there’s culture, followed by governance, then infrastructure, and finally commerce and fashion are the fastest layers. In a loosely‐coupled way, each layer depends on the layer below. In turn, the accumulation of each successive layer enables an “adjacent possible” filled with more opportunities.
Likewise, the expressiveness of CSS and JavaScript is only made possible on a foundation of HTML, which itself requires a URL to be reachable, which in turn depends on the HyperText Transfer Protocol, which sits atop the bedrock of TCP/IP.
Each of the web’s shearing layers can be peeled back to reveal a layer below. Running that process in reverse—applying each layer in turn—is a key principle of resilient web design.
Progressive enhancement
In 2003, the South by Southwest festival in Austin, Texas was primarily an event for musicians and filmmakers. Today the music and film portions are eclipsed by the juggernaut of South by Southwest Interactive, dedicated to all things digital. In 2003, South by Southwest Interactive was a small affair, squeezed into one corner of one floor of the Austin Convention Center. It was a chance for a few web designers and bloggers to get together and share ideas. That year, Steven Champeon and Nick Finck presented a talk entitled Inclusive Web Design For the Future with Progressive Enhancement. They opened with this call to arms:
Web design must mature and accept the developments of the past several years, abandon the exclusionary attitudes formed in the rough and tumble dotcom era, realize the coming future of a wide variety of devices and platforms, and separate semantic markup from presentation logic and behavior.
Like Tim Berners‐Lee, Steven Champeon had experience of working with SGML, the markup language that would so heavily influence HTML. In dealing with documents that needed to be repurposed for different outputs, he came to value the separation of structure from presentation. A meaningfully marked up document can be presented in multiple ways through the addition of CSS and JavaScript.
This layered approach to the web allows the same content to be served up to a wide variety of people. But this doesn’t mean that everyone gets the same experience. Champeon realised that a strong separation of concerns would allow enhancements to be applied according to the capabilities of the end user’s device.
To paraphrase Karl Marx, progressive enhancement allows designers to ask from each browser according to its ability, and to deliver to each device according to its needs.
Do websites need to look exactly the same in every browser?
Some web designers were concerned that progressive enhancement would be a creative straitjacket. Designing for the lowest common denominator did not sound like a recipe for progress. But this was a misunderstanding. Progressive enhancement asks that designers start from the lowest common denominator (a well marked‐up document), but there is no limit to where they can go from there.
In fact, it’s the very presence of a solid baseline of HTML that allows web designers to experiment with the latest and greatest CSS. Thanks to Postel’s Law and the loose error‐handling model of CSS, designers are free to apply styles that only work in the newest browsers.
This means that not everyone will experience the same visual design. This is not a bug. This is a feature of the web. New browsers and old browsers; monochrome displays and multi‐coloured displays; fast connections and slow connections; big screens, small screens, and no screens; everyone can access your content. That content should look different in such varied situations. If a website looks the same on a ten‐year‐old browser as it does on the newest devices, then it probably isn’t taking advantage of the great flexibility that the web offers.
To emphasise this, designer Dan Cederholm created a website to answer the question, “Do websites need to look exactly the same in every browser?” You can find the answer to that question at the URL dowebsitesneedtolookexactlythesameineverybrowser.com—the question itself, with the spaces removed.
At the risk of spoiling the surprise for you, the answer is a resounding “No!” If you visit that website, you will see that answer proudly displayed. But depending on the capabilities of your browser, you may or may not see some of the stylistic flourishes applied to that single‐word answer. Even if you don’t get any of the styles, you’ll still get the content marked up with semantic HTML.
Cutting the mustard
Separating structure and presentation is relatively straightforward. You can declare whatever styles you want, safe in the knowledge that browsers will ignore what they don’t understand. Separating structure and behaviour isn’t quite so easy. If you give a browser some JavaScript that it doesn’t understand, not only will it not apply the desired behaviour, it will refuse to parse the rest of the JavaScript.
Before you use a particular feature in JavaScript, it’s worth testing to see if that feature is supported. This kind of feature detection can save your website’s visitors from having a broken experience because of an unsupported feature. If you want to use Ajax, check first that the browser supports the object you’re about to use to enable that Ajax functionality. If you want to use the geolocation API, check first that the browser supports it.
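As an illustration of that kind of check, here is a hedged sketch. The `enhanceWithGeolocation` and `showMap` names are hypothetical, and the navigator object is passed in as a parameter so the logic can be exercised anywhere:

```javascript
// A sketch of feature detection for the geolocation API.
// `enhanceWithGeolocation` and `showMap` are illustrative names,
// not from any particular codebase.
function enhanceWithGeolocation(nav, showMap) {
  if (nav.geolocation && typeof nav.geolocation.getCurrentPosition === 'function') {
    // The feature is supported: apply the enhancement.
    nav.geolocation.getCurrentPosition(showMap);
    return true;
  }
  // No support: skip the enhancement. The core content still works.
  return false;
}

// In a browser you would call it with the real global:
// enhanceWithGeolocation(navigator, drawMapAroundUser);
```

The same pattern works for Ajax: test for the object you need before calling it, and let the page fall back to its core behaviour when the test fails.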
A team of web developers working on the BBC news website referred to this kind of feature detection as cutting the mustard. Browsers that cut the mustard get an enhanced experience. Browsers that don’t cut the mustard still get access to the content, but without the JavaScript enhancements.
Feature detection, cutting the mustard, whatever you want to call it, is a fairly straightforward technique. Let’s say you want to traverse the DOM using querySelector and attach events to some nodes in the document using addEventListener. Your mustard‐cutting logic might look something like this:
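Here is a minimal sketch of such a check, assuming the enhancements rely on exactly those two methods. It is wrapped in a function so it can be reused; the `cutsTheMustard` name is illustrative, and in a browser you would pass in the real `document` and `window`:

```javascript
// Cutting the mustard, sketched as a reusable check. The function
// name is illustrative. Test for the exact features the planned
// enhancements rely on before applying any of them.
function cutsTheMustard(doc, win) {
  return !!(doc.querySelector && win.addEventListener);
}

// In a browser:
// if (cutsTheMustard(document, window)) {
//   // Enhance! Traverse the DOM and attach event listeners.
// }
```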
This is feature detection, not browser detection. Instead of asking “which browser are you?” and trying to infer feature support from the answer, it is safer to simply ask “do you support this feature?”
There is no else statement.
Aggressive enhancement
Back when web designers were trying to exert print‐like control over web browsers, a successful design was measured in pixel perfection: did the website look exactly the same in every browser?
Unless every browser supported a particular feature—like, say, rounded corners in CSS—then that feature was off the table. Instead, designers would fake it with extra markup and images. The resulting websites lacked structural honesty. Not only was this a waste of talent and energy on the part of the designers, it was a waste of the capabilities of modern web browsers.
The rise of mobiles, tablets, and responsive design helped to loosen this restrictive mindset. It is no longer realistic to expect pixel‐perfect parity across devices and browsers. But you’ll still find web designers bemoaning the fact that they have to support an older, outdated browser because a portion of their audience is still using it.
They’re absolutely right. Anyone using that older browser should have access to the same content as someone using the latest and greatest web browser. But that doesn’t mean they should get the same experience. As Brad Frost puts it:
There is a difference between support and optimization.
Support every browser ...but optimise for none.
Some designers have misunderstood progressive enhancement to mean that all functionality must be provided to everyone. It’s the opposite. Progressive enhancement means providing core functionality to everyone. After that, it’s every browser for itself. Far from restricting what features you can use, progressive enhancement provides web designers with a way to safely use the latest and greatest features without worrying about older browsers. Scott Jehl of the agency Filament Group puts it succinctly:
Progressive Enhancement frees us to focus on the costs of building features for modern browsers, without worrying much about leaving anyone out. With a strongly qualified codebase, older browser support comes nearly for free.
If a website is built using progressive enhancement then it’s okay if a particular feature isn’t supported or fails to load: Ajax, geolocation, whatever. As long as the core functionality is still available, web designers don’t need to bend over backwards trying to crowbar support for newer features into older browsers.
You also get a website that’s more resilient to JavaScript’s error‐handling model. Mat Marquis worked alongside Scott Jehl on the responsive website for the Boston Globe. He noted:
Lots of cool features on the Boston Globe don’t work when JS breaks; “reading the news” is not one of them.
The trick is identifying what is considered core functionality and what is considered an enhancement.
With a title like Resilient Web Design, you might think that this is a handbook for designing robust websites. This is not a handbook. It’s more like a history book.
Marshall McLuhan once said:
We look at the present through a rear‐view mirror. We march backwards into the future.
But in the world of web design, we are mostly preoccupied with the here and now. When we think beyond our present moment, it is usually to contemplate the future—to imagine the devices, features, and interfaces that don’t yet exist. We don’t have time to look back upon our past, and yet the history of web design is filled with interesting ideas.
The World Wide Web has been around for long enough now that we can begin to evaluate the twists and turns of its evolution. I wrote this book to highlight some of the approaches to web design that have proven to be resilient. I didn’t do this purely out of historical interest (although I am fascinated by the already rich history of our young industry). In learning from the past, I believe we can better prepare for the future.
You won’t find any code in here to help you build better websites. But you will find ideas and approaches. Ideas are more resilient than code. I’ve tried to combine the most resilient ideas from the history of web design into an approach for building the websites of the future.
I hope you will join me in building a web that lasts; a web that’s resilient.
--------------------------------------------------------------------------------
/js/async-waituntil.js:
--------------------------------------------------------------------------------
{
  const waitUntil = ExtendableEvent.prototype.waitUntil;
  const respondWith = FetchEvent.prototype.respondWith;
  const promisesMap = new WeakMap();

  ExtendableEvent.prototype.waitUntil = function(promise) {
    const extendableEvent = this;
    let promises = promisesMap.get(extendableEvent);

    if (promises) {
      promises.push(Promise.resolve(promise));
      return;
    }

    promises = [Promise.resolve(promise)];
    promisesMap.set(extendableEvent, promises);

    // call original method
    return waitUntil.call(extendableEvent, Promise.resolve().then(function processPromises() {
      const len = promises.length;

      // wait for all to settle
      return Promise.all(promises.map(p => p.catch(() => {}))).then(() => {
        // have new items been added? If so, wait again
        if (promises.length != len) return processPromises();
        // we're done!
        promisesMap.delete(extendableEvent);
        // reject if one of the promises rejected
        return Promise.all(promises);
      });
    }));
  };

  FetchEvent.prototype.respondWith = function(promise) {
    this.waitUntil(promise);
    return respondWith.call(this, promise);
  };
}
--------------------------------------------------------------------------------
/js/scripts.js:
--------------------------------------------------------------------------------
'use strict';

if ('serviceWorker' in navigator) {
  // If service workers are supported
  navigator.serviceWorker.register('/serviceworker.js');
} else if ('applicationCache' in window) {
  // Otherwise inject an iframe to use appcache
  var iframe = document.createElement('iframe');
  iframe.setAttribute('src', '/appcache.html');
  iframe.setAttribute('style', 'width: 0; height: 0; border: 0');
  document.querySelector('footer').appendChild(iframe);
}

function getElementsByText(scope, text) {
  // iterate descendants of scope
  for (var all = scope.childNodes, index = 0, element, list = []; (element = all[index]); ++index) {
    // conditionally return element containing visible, whitespace-insensitive, case-sensitive text (a match)
    if (element.nodeType === 1 && (element.innerText || element.textContent || '').replace(/\s+/g, ' ').indexOf(text.replace(/\s+/g, ' ')) !== -1) {
      list = list.concat(getElementsByText(element, text));
    }
  }

  // return scope (no match)
  return list.length ? list : scope;
}

// update URL hash when text is selected
(function (win, doc) {
  // cut the mustard
  if (!win.getSelection || !encodeURIComponent) {
    return;
  }
  function updateURL() {
    var selection = win.getSelection();
    if (selection) {
      var selectedText = selection.toString(),
          selectedNode = selection.anchorNode && selection.anchorNode.parentElement;
      if (selectedText.length > 1) {
        var hash = '#' + encodeURIComponent(selectedText);

        var elements = getElementsByText(document, selectedText),
            length = elements.length,
            which = length && elements.indexOf(selectedNode);

        if (which && which > 0) {
          hash += '++' + which;
        }

        if (win.history && win.history.pushState) {
          win.history.pushState(null, null, hash);
        } else if (win.location && win.location.hash) {
          win.location.hash = hash;
        }
      }
    }
  }
  doc.body.addEventListener('mouseup', updateURL, false);
  doc.body.addEventListener('touchend', updateURL, false);
})(window, window.document);

// detect native/existing fragmention support
if (!('fragmention' in window.location)) (function () {
  // populate fragmention
  location.fragmention = location.fragmention || '';

  // return first element in scope containing case-sensitive text
  // on dom ready or hash change
  function onHashChange() {
    // do nothing if the dom is not ready
    if (!/e/.test(document.readyState)) return;

    // set location fragmention as uri-decoded text (from href, as hash may be decoded)
    var
      id = location.href.match(/#((?:#|%23)?)(.+)/) || [0, '', ''],
      node = document.getElementById(id[1] + id[2]),
      match = decodeURIComponent(id[2].replace(/\+/g, ' ')).split(' ');

    location.fragmention = match[0];
    location.fragmentionIndex = parseFloat(match[1]) || 0;

    // conditionally remove stashed element fragmention attribute
    if (element) {
      element.removeAttribute('fragmention');

      // DEPRECATED: trigger style in IE8
      if (element.runtimeStyle) {
        element.runtimeStyle.windows = element.runtimeStyle.windows;
      }
    }

    // if fragmention exists
    if (!node && location.fragmention) {
      var
        // get all elements containing text (or document)
        elements = getElementsByText(document, location.fragmention),
        // get total number of elements
        length = elements.length,
        // get index of element
        modulus = length && location.fragmentionIndex % length,
        index = length && modulus >= 0 ? modulus : length + modulus;

      // get element
      element = length && elements[index];

      // if element found
      if (element) {
        // scroll to element
        element.scrollIntoView();

        // set fragmention attribute
        element.setAttribute('fragmention', '');

        // DEPRECATED: trigger style in IE8
        if (element.runtimeStyle) {
          element.runtimeStyle.windows = element.runtimeStyle.windows;
        }
      }
      // otherwise clear stashed element
      else {
        element = null;
      }
    }
  }

  var
    // DEPRECATED: configure listeners
    defaultListener = 'addEventListener',
    addEventListener = defaultListener in window ? [defaultListener, ''] : ['attachEvent', 'on'],
    // set stashed element
    element;

  // add listeners
  window[addEventListener[0]](addEventListener[1] + 'hashchange', onHashChange);
  document[addEventListener[0]](addEventListener[1] + 'readystatechange', onHashChange);

  onHashChange();
})();
--------------------------------------------------------------------------------
/manifest.appcache:
--------------------------------------------------------------------------------
CACHE MANIFEST
# rev 41

CACHE:
/
/introduction/
/chapter1/
/chapter2/
/chapter3/
/chapter4/
/chapter5/
/chapter6/
/chapter7/
/author/
/contents/
/theindex/
/css/styles.css
/js/scripts.js
/fonts/etbookot-roman-webfont.woff2
/fonts/etbookot-italic-webfont.woff2
/fonts/etbookot-bold-webfont.woff2
/chapter1/images/small/enquire-within-upon-everything.jpg
/chapter1/images/small/information-management.jpg
/chapter1/images/small/line-mode-browser.jpg
/chapter1/images/small/sir-tim-berners-lee.jpg
/chapter1/images/small/submarine-cable-map.jpg
/chapter1/images/small/sundial.jpg
/chapter1/images/small/vague-but-exciting.jpg
/chapter1/images/medium/enquire-within-upon-everything.jpg
/chapter1/images/medium/information-management.jpg
/chapter1/images/medium/line-mode-browser.jpg
/chapter1/images/medium/sir-tim-berners-lee.jpg
/chapter1/images/medium/submarine-cable-map.jpg
/chapter1/images/medium/sundial.jpg
/chapter1/images/medium/vague-but-exciting.jpg
/chapter1/images/large/enquire-within-upon-everything.jpg
/chapter1/images/large/information-management.jpg
/chapter1/images/large/line-mode-browser.jpg
/chapter1/images/large/sir-tim-berners-lee.jpg
/chapter1/images/large/submarine-cable-map.jpg
/chapter1/images/large/sundial.jpg
/chapter1/images/large/vague-but-exciting.jpg
/chapter2/images/small/zengarden.png
/chapter2/images/medium/zengarden.png
/chapter2/images/large/zengarden.png
/chapter3/images/small/book-of-kells.jpg
/chapter3/images/small/gutenberg-bible.jpg
/chapter3/images/small/iphone.jpg
/chapter3/images/small/jan-tchichold-medieval-manuscript-framework.png
/chapter3/images/medium/book-of-kells.jpg
/chapter3/images/medium/gutenberg-bible.jpg
/chapter3/images/medium/iphone.jpg
/chapter3/images/medium/jan-tchichold-medieval-manuscript-framework.png
/chapter3/images/large/book-of-kells.jpg
/chapter3/images/large/gutenberg-bible.jpg
/chapter3/images/large/iphone.jpg
/chapter3/images/large/jan-tchichold-medieval-manuscript-framework.png
/chapter4/images/small/jon-postel.jpg
/chapter4/images/small/nasa.png
/chapter4/images/medium/jon-postel.jpg
/chapter4/images/medium/nasa.png
/chapter4/images/large/jon-postel.jpg
/chapter4/images/large/nasa.png
/chapter5/images/small/shearing-layers.jpg
/chapter5/images/medium/shearing-layers.jpg
/chapter5/images/large/shearing-layers.jpg
/chapter6/images/small/duck.jpg
/chapter6/images/small/news.png
/chapter6/images/small/photosharing.png
/chapter6/images/small/socialnetworks.png
/chapter6/images/small/writing.png
/chapter6/images/medium/duck.jpg
/chapter6/images/medium/news.png
/chapter6/images/medium/photosharing.png
/chapter6/images/medium/socialnetworks.png
/chapter6/images/medium/writing.png
/chapter6/images/large/duck.jpg
/chapter6/images/large/news.png
/chapter6/images/large/photosharing.png
/chapter6/images/large/socialnetworks.png
/chapter6/images/large/writing.png
/chapter7/images/small/devices.jpg
/chapter7/images/small/grace-hopper.jpg
/chapter7/images/small/future-friendly.png
/chapter7/images/medium/devices.jpg
/chapter7/images/medium/grace-hopper.jpg
/chapter7/images/medium/future-friendly.png
/chapter7/images/large/devices.jpg
/chapter7/images/large/grace-hopper.jpg
/chapter7/images/large/future-friendly.png
/author/images/small/jeremykeith.jpg
/author/images/medium/jeremykeith.jpg
/author/images/large/jeremykeith.jpg
/favicon.ico

NETWORK:
*
--------------------------------------------------------------------------------
/manifest.json:
--------------------------------------------------------------------------------
{
  "lang": "en",
  "short_name": "Resilience",
  "name": "Resilient Web Design by Jeremy Keith",
  "description": "A web book in seven chapters on the past, present, and future of web design. By Jeremy Keith.",
  "icons": [{
    "src": "/images/icon48.png",
    "sizes": "48x48",
    "type": "image/png"
  }, {
    "src": "/images/icon72.png",
    "sizes": "72x72",
    "type": "image/png"
  }, {
    "src": "/images/icon96.png",
    "sizes": "96x96",
    "type": "image/png"
  }, {
    "src": "/images/icon144.png",
    "sizes": "144x144",
    "type": "image/png"
  }, {
    "src": "/images/icon168.png",
    "sizes": "168x168",
    "type": "image/png"
  }, {
    "src": "/images/icon192.png",
    "sizes": "192x192",
    "type": "image/png"
  }, {
    "src": "/images/icon250.png",
    "sizes": "250x250",
    "type": "image/png"
  }, {
    "src": "/images/icon300.png",
    "sizes": "300x300",
    "type": "image/png"
  }, {
    "src": "/images/icon512.png",
    "sizes": "512x512",
    "type": "image/png"
  }],
  "start_url": "/",
  "display": "standalone",
  "background_color": "#5f7995",
  "theme_color": "#5f7995"
}
--------------------------------------------------------------------------------
/markdown/chapter01.md:
--------------------------------------------------------------------------------
1 | #Chapter 1: Foundations
2 |
3 | The history of human civilisation is a tale of cumulative effort. Each generation builds upon the work of their forebears. Sometimes the work takes a backward step. Sometimes we wander down dead ends. But we struggle on. Bit by bit our species makes progress. Whether the progress is incremental or a huge leap forward, it is always borne upon the accomplishments of those who came before us.
4 |
5 | Nowhere is this layered nature of progress more apparent than in the history of technology. Even the most dramatic bounds in technological advancement are only possible when there is some groundwork to build upon.
6 |
7 | Gutenberg's printing press would not have been invented if it weren't for the work already done in creating the screw press used for winemaking. Technologies aren't created in isolation. They are imprinted with the ghosts of their past.
8 |
9 | The layout of the QWERTY keyboard for your computer—and its software equivalent on your phone—is an echo of the design of the first generation of typewriters. That arrangement of keys was chosen to reduce the chances of mechanical pieces of metal clashing as they sprang forward to leave their mark on the paper.
10 |
11 | The hands on a clock face move in a clockwise direction only because that's the direction that the shadow cast by a sundial moves over the course of a day in the northern hemisphere. Had history turned out differently, with the civilisation of the southern hemisphere in the ascendent, then the hands on our clocks would today move in the opposite direction. As for why those clocks carve out the time in periods of 24 hours, each with 60 minutes, with each minute having 60 seconds, that's thanks to an ancient Sumerian civilisation. They hit upon using the number 60 as their base for counting and commerce. It's the lowest number that can be equally divided by the first six numbers. That sexagesimal way of counting is still with us today in the hours, minutes, and seconds that we use as conceptual models for subdividing one rotation of our planet.
12 |
13 | These echoes of the past reverberate in the present even when their usefulness has been outlived. You'll still sometimes see a user interface that displays an icon of a Compact Disc or vinyl record to represent music. That same interface might use the image of a 3.5 inch floppy disk to represent the concept of saving data. The reason why floppy disks wound up being 3.5 inches in size is because the disk was designed to fit into a shirt pocket. The icons in our software interfaces are whispering stories to us from the history of clothing and fashion.
14 |
15 | ##Share what you know
16 |
17 | Scientific progress would be impossible without a shared history of learning to build upon. As Sir Isaac Newton put it, if we have seen further it is by standing on the shoulders of giants.
18 |
19 | When knowledge is passed from one generation to the next, theories become more refined, units of measurement become standardised, and experiments increase in precision.
20 |
21 | Right now humanity's most precise experiments are being conducted beneath the border between Switzerland and France. This is the home of CERN, the European Organisation for Nuclear Research. In the 16-mile wide ring of its Large Hadron Collider, protons are being smashed together at velocities approaching the speed of light. Our primate species is recreating the conditions from the start of our universe. The LHC is the most complex piece of machinery that has ever been built.
22 |
23 | The awe-inspiring engineering of the LHC is matched by the unprecedented levels of international co-operation behind CERN. The particle accelerator became operational in the first decade of the 21st century but the groundwork was laid more than half a century before. That was when a group of nations came together to create the CERN Convention, dedicating resources and money towards pure scientific research. The only return on investment they expected was in the currency of knowledge.
24 |
25 | This groundwork created a unique environment free from the constraints of national, economic, and social hierarchies. Nobel prize-winning physicists collaborate with students on summer internships. If there is an element of social categorisation at CERN, it is only between theorists and experimentalists. The theorists are the ones with blackboards in their offices. The experimentalists are the ones with computers. They have to deal with a lot of data. Even before the Large Hadron Collider was switched on, managing information was a real challenge at CERN.
26 |
27 | Enter Tim Berners-Lee, a computer scientist from England who found himself working at CERN in the 1980s. At the beginning of that decade, he started a personal project to get to grips with managing information. The resulting software was called ENQUIRE, named for a Victorian manual of domestic life called _Enquire Within Upon Everything_.
28 |
29 | By the end of the '80s, Tim Berners-Lee was ready to tackle the thorny problem of information management on a larger scale. In order to get buy-in at CERN, he produced an unassuming document with the title _Information Management: A Proposal_. Fortunately his supervisor, Mike Sendall, recognised the potential of the idea and gave the go-ahead by scrawling the words "vague but exciting" across the top of the paper. That proposal would become the Wide World Web.
30 |
31 | ##Net value
32 |
33 | Today we think of the World Wide Web as one of the greatest inventions in the history of communication, but to the scientists at CERN it is merely a byproduct. When you're dealing in cosmological timescales and investigating the very building blocks of reality itself, the timeline of humanity's relationship with technology is little more than a rounding error.
34 |
35 | When Tim Berners-Lee decided to take on the problem of information management at CERN, the internet was already established as part of the infrastructure there. This network of networks was first created in the 1960s and the early adopters were universities and scientific institutions.
36 |
37 | These nodes were already physically connected via telephone wires. Rather than build an entirely new network from scratch, it made sense to reuse what was already there. Once again, a new technology was only made possible by the existence of an older one. In the nineteenth century the world was technologically terraformed for the telegraph. Through astonishing feats of engineering, our planet was wired up with undersea cables. Those same cables would later be repurposed to carry telephone signals. Later again, they would carry the digital ones and zeros of the internet. Today those signals are transmitted via pulses of light travelling through fibre-optic cables. Those fibre-optic cables follow the same paths across the ocean floor as their telegraphic antecedents.
38 |
39 | The internet has no centre. This architectural decision gives the network its robustness. You may have heard that the internet was designed to resist a nuclear attack. That's not entirely correct. It's true that the project began with military considerations. The initial research was funded by DARPA, the Defense Advanced Research Projects Agency. But the engineers working on the project were not military personnel. Their ideals had more in common with the free-speech movement than with the military-industrial complex. They designed the network to route around damage, but the damage they were concerned with was censorship, not a nuclear attack.
40 |
41 | The open architecture of the internet reflected the liberal worldview of its creators. As well as being decentralised, the internet was also deliberately designed to be a dumb network. That's its secret sauce. The protocols underlying the transmission of data on the internet—TCP/IP—describe how packets of data should be moved around, but those protocols care not a whit for the contents of the packets. That allows the internet to be the transport mechanism for all sorts of applications: email, Telnet, FTP, and eventually the World Wide Web.
42 |
43 | ##Hyperspace
44 |
45 | The web uses HTTP, the HyperText Transfer Protocol, to send and receive data. This data is uniquely identified with a URL. Many of these URLs identify pages made of HTML, the HyperText Markup Language. The killer feature of the web lies here in HTML's humble A element. The A stands for Anchor. Its href attribute allows you to cast off from within one URL out to another URL, creating a link that can be traversed from one page to the other. These links turn the web from being a straightforward storage and retrieval system into a hypertext system.
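That's all there is to it. A hypothetical example (the URL is made up):

```html
<!-- The href attribute casts off from this page to another URL. -->
<a href="https://example.com/next-page">continue reading</a>
```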
46 |
47 | Tim Berners-Lee did not invent hypertext. That term was coined by Ted Nelson, a visionary computer scientist who was working on his own hypertext system called Xanadu. Both Ted Nelson and Tim Berners-Lee were influenced by the ideas set out by Vannevar Bush in his seminal 1945 essay, _As We May Think_. Bush, no doubt, was in turn influenced by the ideas of Belgian informatician Paul Otlet. Each one of these giants in the history of hypertext was standing on the shoulders of the giants that had come before them. Giants all the way down.
48 |
49 | Compared to previous grand visions of hypertext, links on the web are almost laughably simplistic. There is no two-way linking. If you link to another URL, but the page at that URL is moved or destroyed, you have no way of knowing.
50 |
51 | But the simplicity of the web turned out to be the secret of its success.
52 |
53 | Tim Berners-Lee assumed that most URLs would point to non-HTML resources: word-processing documents, spreadsheets, and all sorts of other computer files. HTML could then be used to create simple index pages that point to these files using links. Because HTML didn't need to do very much, it had a limited vocabulary. That made it relatively easy to learn. To Tim Berners-Lee's surprise, people began to create fully-fledged documents in HTML. Instead of creating content in other file formats and using HTML to link them up, people began writing content directly in HTML.
54 |
55 | ##Mark me up, mark me down
56 |
57 | HTML wasn't the first markup language to be used at CERN. Scientists there were already sharing documents written in a flavour of SGML—Standard Generalized Markup Language. Tim Berners-Lee took this existing vocabulary from CERN SGML and used it as a starting point for his new markup language. Once again, it made sense to build on what people were already familiar with rather than creating something from scratch.
58 |
59 | The first version of HTML contained a grand total of 21 elements. Many of those elements are still with us today—TITLE, P, UL, LI, H1, H2, etc., and of course, the A element. Others have fallen by the wayside—ISINDEX, PLAINTEXT, LISTING, HP1, HP2, etc., as well as a proprietary element called NEXTID that only made sense if you were using a computer running the NeXTSTEP operating system. That's the OS that Tim Berners-Lee was using when he created HTTP, HTML, and the world's first web browser, called confusingly WorldWideWeb, which only worked on NeXT machines.
60 |
61 | To demonstrate the power and interoperability of the web, a cross-platform browser was needed; one that anybody could install and use, no matter what operating system they were using. The task of building that browser fell to an undergraduate at CERN named Nicola Pellow. She created the Line Mode Browser. It was simple but powerful. It didn't have the same level of interactivity as the WorldWideWeb browser, but the fact that it could be run on any machine meant that the web was now accessible to everyone.
62 |
63 | As soon as there were two web browsers in the world, interoperability and backwards compatibility became important issues. For instance, what should the Line Mode Browser do when it encounters an HTML element it doesn't understand, such as NEXTID?
64 |
65 | The answer can be found in the sparse documentation that Tim Berners-Lee wrote for his initial collection of elements, a document called _HTML Tags_. Under the heading "Next ID" he wrote:
66 |
67 | > Browser software may ignore this tag.
68 |
69 | This seemingly innocuous decision would have far-reaching consequences for the future of the World Wide Web.
--------------------------------------------------------------------------------
/markdown/chapter02.md:
--------------------------------------------------------------------------------
1 | #Chapter 2: Materials
2 |
3 | At the risk of teaching grandmother to suck eggs, I'd like you to think about what happens when a browser parses an HTML element. Take, for example, a paragraph element with some text inside it. There's an opening P tag, a closing P tag, and between those tags, there's the text.
4 |
5 | <p>some text</p>
6 |
7 | A web browser encountering this element will display the text between the opening and closing tags. Now consider what happens when that same web browser encounters an element it doesn't recognise.
8 |
9 | <marklar>some more text</marklar>
10 |
11 | Once again, the browser displays the text between the opening and closing tags. What's interesting here is what the browser doesn't do. The browser does not throw an error. The browser does not stop parsing the HTML at this point, refusing to go any further. Instead, it simply ignores the tags and displays the content within.
12 |
13 | This liberal attitude to errors allowed the vocabulary of HTML to grow over time from the original 21 elements to the 121 elements in HTML5. Whenever a new element is introduced to HTML, we know exactly how older browsers will treat it; they will ignore the tags and display the content.
14 |
15 | That's a remarkably powerful feature. It allows browsers to implement new HTML features at different rates. We don't have to wait for every browser to recognise a new element. Instead we can start using the new element at any time, secure in the knowledge that non-supporting browsers won't choke on it.
16 |
17 | <marklar>this text will display in any browser</marklar>
18 |
19 | If web browsers treat all tags the same way—displaying their contents—then what's the point of having a vocabulary of elements in HTML?
20 |
21 | ##The meaning of markup
22 |
23 | Some HTML elements are literally meaningless. The SPAN element says nothing about the contents within it. As far as a web browser is concerned, you may as well use a non-existent MARKLAR element. But that's the exception. Most HTML elements exist for a reason. They have been created and agreed upon in order to account for specific situations that authors like you and I are likely to encounter.
24 |
25 | There are obviously special elements, like the A element, that come bundled with superpowers. In the case of the A element, its superpower lies in the HREF attribute that allows us to link out to any other resource on the web. Other elements like INPUT, SELECT, TEXTAREA, and BUTTON have their own superpowers, allowing people to enter data and submit it to a web server.
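A minimal sketch of those superpowers at work (the action URL and names here are made up):

```html
<form action="/search" method="get">
  <input name="q">            <!-- lets people enter data -->
  <select name="scope">       <!-- lets people choose an option -->
    <option>everything</option>
    <option>titles only</option>
  </select>
  <button>Search</button>     <!-- submits the data to the web server -->
</form>
```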
26 |
27 | Then there are elements that describe the kind of content they contain. The contents of a P element should be considered a paragraph of text. The contents of an LI element should be considered an item in a list. Browsers display the contents of these elements with some visual hints as to their meaning. Paragraphs are displayed with whitespace before and after their content. List items are displayed with bullet points or numbers before their content.
28 |
29 | The early growth of HTML's vocabulary was filled with new elements that provided visual instructions to web browsers: BIG, SMALL, CENTER, FONT. In fact, the visual instructions were the only reason for those elements to exist—they provided no hint as to the *meaning* of the content they contained. HTML was in danger of becoming a visual instruction language instead of a vocabulary of meaning.
30 |
31 | ##A matter of style
32 |
33 | Håkon Wium Lie was working at CERN at the same time as Tim Berners-Lee. He immediately recognised the potential of the World Wide Web and its language, HTML. He also realised that the expressive power of the language was in danger of being swamped by visual features. Lie proposed a new format to describe the presentation of HTML documents: Cascading Style Sheets.
34 |
35 | He was quickly joined by the Dutch programmer Bert Bos. Together they set about creating a syntax that would be powerful enough to handle the demands of designers, while remaining simple enough to learn quickly. They succeeded.
36 |
37 | Think for a moment of all the sites out there on the web. There's a huge variation in visual style: colour schemes, typographic treatments, textures and layouts. All of that variety is made possible by one simple pattern that describes all the CSS ever written:
38 |
39 | selector {
40 | property: value;
41 | }
42 |
43 | That's it.
44 |
45 | CSS shares HTML's forgiving attitude to errors. If a web browser encounters a selector it doesn't understand, it simply skips over whatever is between that selector's curly braces. If a browser sees a property or a value it doesn't understand, it just ignores that particular declaration. The browser does not throw an error. The browser does not stop parsing the CSS at this point, refusing to go any further.
46 |
47 | marklar {
48 | marklar: marklar;
49 | }
50 |
51 | Just as with HTML, this loose error-handling has allowed CSS to grow over time. New selectors, new properties, and new values have been added to the language's vocabulary over the years. Whenever a new feature lands in CSS, designers and developers know that they can safely use it, even if it isn't yet widely supported in browsers. They can rest assured that old browsers will react to new features with complete indifference.
52 |
53 | Just because a language is elegant and well-designed doesn't mean that people will use it. CSS arrived later than HTML. Designers didn't spend the intervening years waiting patiently for a way to style their documents on the web. They used what was available to them.
54 |
55 | ###Killing it
56 |
57 | In 1996 David Siegel published a book entitled _Creating Killer Websites_. In it, he outlined a series of ingenious techniques for wrangling eye-catching designs out of the raw material of HTML.
58 |
59 | One technique involved using a transparent GIF, just one pixel by one pixel in size. If this was inserted into a page as an IMG element, but given precise values in its width and height attributes, designers could control the amount of whitespace in their designs.
60 |
61 | Another technique used the TABLE element. This element—along with its children TR and TD—was intended to describe tabular data. But with the right values applied to the widths and heights of table cells, it could be used to recreate just about any desired layout.
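A simplified sketch of those two hacks combined (shim.gif is a hypothetical one-pixel transparent image):

```html
<!-- A TABLE pressed into service as a layout grid, with a 1×1
     transparent GIF stretched to control the whitespace. -->
<table width="600">
  <tr>
    <td width="150">navigation</td>
    <td><img src="shim.gif" width="30" height="1" alt=""></td>
    <td>main content</td>
  </tr>
</table>
```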
62 |
63 | These were hacks; clever solutions to tricky problems. But they had unfortunate consequences. Designers were treating HTML as a tool for the appearance of content instead of a language for describing the meaning of content. CSS was a solution to this problem, if only designers could be convinced to use it.
64 |
65 | ###Browser wars
66 |
67 | One of the reasons why web designers weren't using CSS was the lack of browser support. Back then there were two major browsers competing for the soul of the web: Microsoft Internet Explorer and Netscape Navigator. They were incompatible by design. One browser would invent a new HTML element or attribute. The other browser would invent their own separate element or attribute to do exactly the same thing.
68 |
69 | Perhaps the thinking behind this strategy was that web designers would have to choose which proprietary features they were going to get behind, like children being forced to choose between parents. In reality web designers had little choice but to write for both browsers, which meant doing twice the work.
70 |
71 | A group of web designers decided enough was enough. They gathered together under the banner of the Web Standards Project and began lobbying Microsoft and Netscape to abandon their proprietary ways and adopt standards such as CSS.
72 |
73 | The tide began to turn with the launch of Internet Explorer 5 for the Mac, a browser that shipped with impressive CSS support. If this was the future of web design, life was about to get a lot more productive and creative.
74 |
75 | Some forward-thinking web designers caught the CSS bug early. They redesigned their websites using CSS for layout instead of using TABLEs and spacer GIFs. True to the founding spirit of the web, they shared what they were learning and encouraged others to make the switch to CSS.
76 |
77 | Perhaps the best demonstration of the power of CSS was a website called the CSS Zen Garden, created by Dave Shea. It was a showcase of beautiful and varied designs, all of them accomplished with CSS. Crucially, the HTML remained the same.
78 |
79 | Seeing the same HTML document styled in a multitude of different ways drove home one of the beneficial effects of CSS: separation of concerns.
80 |
81 | ###Coupling
82 |
83 | In any system, from urban infrastructure to a computer program, the designers of that system can choose the degree to which the pieces of the system depend on one another. In a tightly-coupled system, every piece depends on every piece. In a loosely-coupled system, all the pieces are independent, with little to no knowledge of the other pieces.
84 |
85 | In a tightly-coupled system, each part of the system can make assumptions about the other parts. These systems can be designed quite quickly, but at a price. They lack resilience. If one piece fails, it could take the whole system with it.
86 |
87 | Designing a loosely-coupled system can take more work. The payoff is that the overall result is more resilient to failure. Individual parts of the system can be swapped out with a minimum of knock-on effects.
88 |
89 | The hacks pioneered by David Siegel tightly coupled structure and presentation into a single monolithic HTML file. The adoption of CSS eased this dependency, bringing the web closer to the modular approach of the UNIX philosophy. The presentational information could be moved into a separate file: a style sheet. That's how a single HTML document at the CSS Zen Garden could have so many different designs applied to it.
90 |
91 | The style sheet still needs to have some knowledge of the HTML document's structure. Quite often, "hooks" are added into the markup to make it easier to style: specific values of class or id attributes, for example. So HTML and CSS aren't completely decoupled. They form a partnership but they also have an arrangement. The markup document might decide that it wants to try seeing other style sheets sometimes. Meanwhile, the style sheet could potentially be applied to other documents. They are loosely coupled.
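A sketch of that arrangement, with a made-up class value acting as the hook:

```html
<!-- The markup offers a hook: a hypothetical class value. -->
<p class="intro">Welcome to my website.</p>

<style>
  /* The style sheet only needs to know about the agreed-upon hook,
     not the rest of the document's structure. */
  .intro {
    font-size: 1.25em;
  }
</style>
```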
92 |
93 | ##Dancing about architecture
94 |
95 | It takes time for a discipline to develop its own design values. Web design is a young discipline indeed. While we slowly begin to form our own set of guiding principles, we can look to other disciplines for inspiration.
96 |
97 | The world of architecture has accrued its own set of design values over the years. One of those values is the principle of material honesty. One material should not be used as a substitute for another. Otherwise the end result is deceptive.
98 |
99 | Using TABLEs for layout is materially dishonest. The TABLE element is intended for marking up the structure of tabular data. The end result of using TABLEs, FONT elements, and spacer GIFs is a façade. At first glance everything looks fine, but it won't stand up to scrutiny. As soon as such a website is stress-tested by actual usage across a range of browsers, the façade crumbles.
100 |
101 | Using CSS for presentation is materially honest—that's the intended use of CSS. It also allows HTML to be materially honest. Instead of trying to fulfil two roles—structure and presentation—HTML can return to fulfilling its true purpose, marking up the meaning of content.
102 |
103 | It's still possible to use (or abuse) CSS to be materially dishonest. For the longest time, there was no easy way to add rounded corners to an element using CSS. Instead, web designers found ways to hack around the problem, putting background images on the element to simulate the same end effect. It worked up to a point, but just like the spacer GIF hack, it was a façade. Then the border-radius property arrived. Now designers can have their rounded corners in a materially honest way.
104 |
105 | Crucially, designers were able to use new properties like border-radius long before every web browser supported them. That's all thanks to the liberal error-handling model of CSS. Newer browsers would display the rounded corners. Older browsers would not throw an error. Older browsers would not stop parsing the CSS and refuse to parse any further. They would simply ignore the instructions they didn't understand and move on. No harm, no foul.
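That graceful indifference can be seen in a couple of lines of CSS (the class name and values here are illustrative):

```css
/* Older browsers skip the border-radius declaration they don't
   recognise; newer browsers draw the rounded corners. Either way,
   the background colour still applies. */
.panel {
  background-color: #eee;
  border-radius: 8px;
}
```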
106 |
107 | Of course this means that the resulting website will look different in different browsers. Some people will see rounded corners. Others won't.
108 |
109 | And that's okay.
--------------------------------------------------------------------------------
/markdown/chapter04.md:
--------------------------------------------------------------------------------
1 | #Chapter 4: Languages
2 |
3 | Jon Postel was one of the engineers working on the ARPANET, the precursor to the internet. He wanted to make sure that the packets—or "datagrams"—being shuttled around the network were delivered in the most efficient way. He came to realise that a lax approach to errors was crucial to effective packet switching.
4 |
5 | If a node on the network receives a datagram that has errors, but is still understandable, then the packet should be processed anyway. Conversely every node on the network should attempt to send well-formed packets. This line of thinking was enshrined in the Robustness Principle, also known as Postel's Law:
6 |
7 | > Be conservative in what you send; be liberal in what you accept.
8 |
9 | If that sounds familiar, it's because that's the way that web browsers deal with HTML and CSS. Even if there are errors in the HTML or CSS, the browser will still attempt to process the information, skipping over any pieces that it can't parse.
10 |
11 | ##Declaration
12 |
13 | HTML and CSS are both examples of declarative languages. An author writing in a declarative language describes a desired outcome without providing step-by-step instructions to the computer processing the file. With HTML, you can describe the nature of the content—paragraphs, headings, form fields, etc.—without having to explain exactly what the browser should do with that information. With CSS, you can describe the desired appearance of the content—colours, borders, etc.—without having to write a program to apply those styles.
14 |
15 | Most programming languages are not declarative, they are imperative. Perl, Java, C++ ...these are all examples of imperative languages. If you're writing in one of those languages, you must provide precise instructions to the computer interpreting your code.
16 |
17 | Imperative languages provide you with more power and precision than declarative languages. That comes at a price. Imperative languages tend to be harder to learn than declarative languages. It's also harder to apply Postel's Law to imperative languages. If you make a single mistake—one misplaced comma or semi-colon—the entire program may fail.
18 |
19 | Imperative languages such as PHP, Ruby, and Python can be found on the servers powering the World Wide Web, reading and writing database records, processing input, and running complex algorithms. You can choose just about any programming language you want when writing server-side code. Unlike the unknowability of the end user's web browser, you have control over your server's capabilities.
20 |
21 | If you want to write imperative code that runs in a web browser, you only have one choice: JavaScript.
22 |
23 | ##Scripting
24 |
25 | The idea of executing a program from within a web page is as old as the web itself. Here's an early email to the www-talk mailing list:
26 |
27 | > I would like to know, whether anybody has extended WWW such, that it is possible to start arbitrary programs by hitting a button in a WWW browser.
28 |
29 | Tim Berners-Lee, creator of the World Wide Web, responded:
30 |
31 | > Very good question. The problem is that of programming language. You need something really powerful, but at the same time ubiquitous. Remember a facet of the web is universal readership. There is no universal interpreted programming language.
32 |
33 | That was in 1992. The universal interpreted programming language finally arrived in 1996. It was written in ten days by a programmer at Netscape named Brendan Eich.
34 |
35 | The language went through a few name changes. First it was called Mocha. Then it was officially launched as LiveScript. Then the marketing department swept in and renamed it JavaScript, hoping that the name would ride the wave of hype associated with the then-new Java language. In truth, the languages have little in common. Java is to JavaScript as ham is to hamster.
36 |
37 | ##Patterns of progress
38 |
39 | JavaScript gave designers the power to update web pages even after they had loaded. Two common uses soon emerged: rollovers and form validation.
40 |
41 | Swapping out images when someone hovers their cursor over a link might not seem like a sensible use of a brand new programming language, but back in the nineties there was no other way of creating hover effects.
42 |
43 | Before JavaScript came along, a form would have to be submitted to a web server before you could check to make sure that all the required fields were filled in, or that the information that was entered corresponded to an expected format.
44 |
45 | Both of those use cases still exist, but now there's no need to reach for JavaScript. You can create rollover effects using the :hover pseudo-class in CSS. You can validate form fields using the required and type attributes in HTML.
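Both declarative replacements take only a few lines (the selectors, attributes, and URL here are illustrative):

```html
<!-- Form validation without JavaScript: the browser checks the
     required email field before submitting. -->
<form action="/signup" method="post">
  <label>Email <input type="email" name="email" required></label>
  <button>Sign up</button>
</form>

<style>
  /* A rollover effect without JavaScript. */
  a:hover {
    background-color: yellow;
  }
</style>
```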
46 |
47 | That's a pattern that repeats again and again: a solution is created in an imperative language and if it's popular enough, it migrates to a declarative language over time. When a feature is available in a declarative language, not only is it easier to write, it's also more robust.
48 |
49 | The loose error-handling of HTML and CSS means that any authoring mistakes or browser support issues are handled gracefully; the browser simply ignores what it doesn't understand and carries on. By contrast, if you give a browser some badly-formed JavaScript or attempt to use an unsupported JavaScript feature, not only will the browser throw an error, it will stop parsing the script at that point and refuse to go any further.
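A short sketch of that unforgiving behaviour. Here the error is caught so the script can report what happened; uncaught, execution would simply stop at the failing line:

```javascript
// JavaScript's unforgiving error handling: a single failing statement
// aborts everything that follows it in the same block.
const log = [];
try {
  log.push("before the error");
  undefinedFunction(); // ReferenceError: undefinedFunction is not defined
  log.push("after the error"); // never reached
} catch (error) {
  log.push("caught: " + error.name);
}
console.log(log.join(", "));
// → before the error, caught: ReferenceError
```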
50 |
51 | ##Responsibility
52 |
53 | JavaScript gave web designers the power to create websites that were slicker, smoother, and more reactive. The same technology also gave web designers the power to create websites that were sluggish, unwieldy, and more difficult to use.
54 |
55 | One of the earliest abuses of JavaScript came (unsurprisingly) from the advertising industry, a business whose very raison d'être is often at odds with the goals of people trying to achieve a task as quickly as possible. JavaScript allows you to create new browser windows, something that previously could only be done by the user. A young developer named Ethan Zuckerman realised that he could spawn a new window with an advertisement in it. That allowed advertisers to put their message in front of website visitors. Not only that, but JavaScript could be used to spawn multiple windows, some of them visible, some of them hidden behind the current window. It was a fiendish solution.
56 |
57 | Twenty years later, Zuckerman wrote:
58 |
59 | > I wrote the code to launch the window and run an ad in it. I’m sorry.
60 |
61 | Pop-up (and pop-under) windows became so intolerable that browsers had to provide people with a means to block them.
62 |
63 | The advertising industry later found other ways to abuse JavaScript. Ad-supported online publishers injected bloated and inefficient JavaScript into their pages, making them slow to load. JavaScript was also used to track people from site to site. People reached for ad-blocking software to combat this treatment. Eventually ad blocking was built into browsers and operating systems to give us the ability to battle excessive JavaScript.
64 |
65 | Web designers would do well to remember what the advertising industry chose to ignore: on the web, the user has the final say.
66 |
67 | ##2.0
68 |
69 | The rise of JavaScript was boosted in 2005 with the publication of an article entitled _Ajax: A New Approach to Web Applications_ by Jesse James Garrett. The article put a name to a technique that was gaining popularity. Using a specific subset of JavaScript, it was possible for a web browser to send and receive data from a web server without refreshing the whole page. The result was a smoother user experience.
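A sketch of the pattern Garrett named, using the XMLHttpRequest object that made it possible; the URL and element id are hypothetical:

```javascript
// Ajax in miniature: fetch data from the server in the background,
// then update part of the page in place, with no full refresh.
function loadLatestNews() {
  var request = new XMLHttpRequest();
  request.open("GET", "/news.json");
  request.onload = function () {
    if (request.status === 200) {
      // Update just one part of the page.
      document.getElementById("news").textContent = request.responseText;
    }
  };
  request.send();
}
```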
70 |
71 | The term Ajax was coined at the same time as another neologism was in the ascendant. Tim O'Reilly used the phrase Web 2.0 to describe a new wave of web products and services. Unlike Ajax, it was difficult to pin down a definition of Web 2.0. For business people, it meant new business models. For graphic designers, it meant rounded corners and gradients. For developers, it meant JavaScript and Ajax.
72 |
73 | Whatever its exact meaning, the term Web 2.0 captured a mood and a feeling. Everything was going to be different now. The old ways of thinking about building for the web could be cast aside. Treating the web as a limitless collection of hyperlinked documents was passé. The age of web apps was at hand.
74 |
75 | ##Appiness
76 |
77 | In the 1964 Supreme Court case Jacobellis versus Ohio, Justice Potter Stewart provided this definition of obscenity:
78 |
79 | > I know it when I see it.
80 |
81 | The same could be said for Web 2.0, or for the term "web app". We can all point to examples of web apps, but it's trickier to provide a definition for the term. Web apps allow people to create, edit, and delete content. But these tasks were common long before web apps arrived. People could fill in forms and submit them to a web server for processing. Ajax removed the necessity for that round trip to the server.
82 |
83 | Perhaps the definition of a web app requires some circular reasoning:
84 |
85 | * JavaScript is a requirement for a web app, and
86 | * a web app is a website that requires JavaScript to work.
87 |
88 | In that case, building web apps depends on a fundamental assumption: JavaScript must be available and reliable. But because of its imperative nature, JavaScript is inherently more fragile than a declarative language like HTML. Relying on JavaScript might not be such a safe assumption after all.
89 |
90 | ##Unforgiven
91 |
92 | HTML's loose error-handling allowed it to grow in power over time. It also ensured that the language was easy to learn. Even if you made a mistake, the browser's implementation of Postel's Law ensured that you'd still get a result. Surprisingly, there was an attempt to remove this superpower from HTML.
93 |
94 | After the standardisation of HTML version 4 in 1999, the World Wide Web Consortium published XHTML 1.0. This reformulated HTML according to the rules of the XML data format. Whereas HTML can have uppercase or lowercase tag names and attributes, XML requires them to be all lowercase. There were some other differences: all attributes had to be quoted, and standalone elements like IMG or BR required a closing slash.
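The same image element written both ways illustrates the stricter rules (the file name is made up):

```html
<!-- Valid HTML 4: uppercase names, an unquoted attribute, no closing slash. -->
<IMG SRC=photo.jpg ALT=photo>

<!-- XHTML 1.0: lowercase, quoted attributes, and a trailing slash. -->
<img src="photo.jpg" alt="photo" />
```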
95 |
96 | XHTML 1.0 didn't add any new features to the language. It was simply a stricter way of writing markup. XHTML 2.0 was a different proposition. Not only would it remove established elements like IMG, it would also implement XML's draconian error-handling model. If there is a single error in an XML document—one unquoted attribute or missing closing slash—then the parser should stop immediately and refuse to render anything.
97 |
98 | XHTML 2.0 died on the vine. Its theoretical purity was roundly rejected by the people who actually made websites for a living. Web designers rightly refused to publish in a format that would fail completely instead of trying to recover from an error.
99 |
100 | Strange then, that years later, web designers would happily create entire websites using JavaScript, a language that shares XML's unforgiving error-handling model. They didn't call them websites. They called them web apps. That distinction was cold comfort to someone who couldn't complete their task because a service relied on JavaScript for crucial functionality.
101 |
102 | Despite JavaScript's fragile error-handling model, web designers became more reliant on JavaScript over time. In 2015, NASA relaunched its website as a web app. If you wanted to read the latest news of the agency's space exploration efforts, you first had to download and execute three megabytes of JavaScript. This content—text and images—could have been delivered in the HTML, but the developers decided to use Ajax to retrieve this data instead. Until all that JavaScript was loaded, parsed, and executed, visitors to the site were left staring at a black background. Perhaps this was intended as a demonstration of the vast lonely emptiness of space.
103 |
104 | This highlights another difference between HTML and JavaScript. Whereas HTML can be rendered piece by piece as it is downloaded, a JavaScript file must be downloaded in its entirety before its contents can be parsed. While it's tempting to think that only a small minority of visitors will miss out on a site's JavaScript, the truth is that everybody is a non-JavaScript user until the JavaScript has finished loading ...*if* the JavaScript finishes loading. Flaky connections, interfering network operators, and unpredictable ad-blocking software can torpedo the assumption that JavaScript will always be available.
105 |
106 | The problem is not with people deliberately disabling JavaScript in their browsers. Although that's a use case worth considering, it's not the most common cause of JavaScript errors. Stuart Langridge put together a list of all the potential points of failure under the title _Everyone has JavaScript, right?_
107 |
108 | > The user requests your web app. Has the page loaded yet? Did the HTTP request for the JavaScript succeed? Did the HTTP request for the JavaScript complete? Does the corporate firewall block JavaScript? Does their ISP or mobile operator interfere with downloaded JavaScript? Have they switched off JavaScript? Do they have add-ons or plug-ins installed which inject script or alter the DOM in ways you didn't anticipate? Is the Content Delivery Network up? Does their browser support the JavaScript you've written?
109 |
110 | Many of those problems would also affect HTML and CSS files, but because of Postel's Law, they can recover gracefully.
111 |
112 | This doesn't mean that web designers shouldn't use JavaScript. But it does mean that web designers shouldn't *rely* on JavaScript when a simpler solution exists.
113 |
114 | ##Platform
115 |
116 | Web designers who ignored the message of John Allsopp's _A Dao of Web Design_ made the mistake of treating the web like print. The history of print has much to offer—hierarchy, typography, colour theory—but the web is a fundamentally different medium. The history of software development also has much to offer—architecture, testing, process—but again, the web remains its own medium.
117 |
118 | It's tempting to apply the knowledge and learnings from another medium to the web. But it is more structurally honest to uncover the web's own unique strengths and weaknesses.
119 |
120 | The language we use can subtly influence our thinking. In his book _Metaphors We Live By_, George Lakoff highlights the dangers of political language. Obvious examples are "friendly fire" and "collateral damage", but a more insidious example is "tax relief"—before a debate has even begun, taxation has been framed as something requiring relief.
121 |
122 | On the face of it, the term "web platform" seems harmless. Describing the web as a platform puts it on par with other software environments. Flash was a platform. Android is a platform. iOS is a platform. But the web is not a platform. The whole point of the web is that it is *cross*-platform.
123 |
124 | A platform provides a controlled runtime environment for software. As long as the user has that runtime environment, you can be confident that they will get exactly what you've designed. If you build an iOS app and someone has an iOS device, you know that they will get 100% of your software. But if you build an iOS app and someone has an Android device, they will get 0% of your software. You can't install an iOS app on an Android device. It's all or nothing.
125 |
126 | The web isn't as binary as that. If you build something using web technologies, and someone visits with a web browser, you can't be sure how many of the web technologies will be supported. It probably won't be 100%. But it's also unlikely to be 0%. Some people will visit with iOS devices. Others will visit with Android devices. Some people will get 80% or 90% of what you've designed. Others will get just 20%, 30%, or 50%. The web isn't a platform. It's a continuum.
127 |
128 | Thinking of the web as a platform is a category error. A platform like Flash, iOS, or Android provides stability and certainty, but only under a very specific set of circumstances—your software must be accessed with the right platform-specific runtime environment. The web provides no such certainty, but it also doesn't restrict the possible runtime environments.
129 |
130 | Platforms are controlled and predictable. The World Wide Web is chaotic and unpredictable.
131 |
132 | The web is a hot mess.
--------------------------------------------------------------------------------
/markdown/chapter05.md:
--------------------------------------------------------------------------------
1 | #Chapter 5: Layers
2 |
3 | In his classic book _How Buildings Learn_, Stewart Brand highlights an idea by the British architect Frank Duffy:
4 |
5 | > A building properly conceived is several layers of longevity.
6 |
7 | Duffy called these shearing layers. Each of the layers moves at a different timescale. Brand expanded on the idea, proposing six alliterative layers:
8 |
9 | 1. **Site**—the physical location of a building only changes on a geological timescale.
10 | 2. **Structure**—the building itself can last for centuries.
11 | 3. **Skin**—the exterior surface gets a facelift or a new lick of paint every few decades.
12 | 4. **Services**—the plumbing and wiring need to be updated every ten years or so.
13 | 5. **Space plan**—the layout of walls and doors might change occasionally.
14 | 6. **Stuff**—the arrangement of furniture in a room can change on a daily basis.
15 |
16 | The idea of shearing layers can also be applied to our creations on the web. Our domain names are the geological sites upon which we build. At the other end of the timescale, content on the web—the "stuff"—can be added and updated by the hour, the minute, or even the second. In between are the layers of structure, presentation, and behaviour: HTML, CSS, and JavaScript.
17 |
18 | Those layers can be loosely-coupled, but they aren't completely independent. Just as a building cannot have furniture without first having rooms and walls, a style sheet needs some markup to act upon. The coupling between structure and presentation is handled through selectors in CSS: element selectors, class selectors, and so on. With JavaScript, the coupling is handled through the vocabulary of the Document Object Model, or DOM.
19 |
20 | In a later book, _The Clock Of The Long Now_, Stewart Brand applied the idea of shearing layers—or pace layers—to civilisation itself. The slowest moving layer is nature itself, then there's culture, followed by governance, then infrastructure, and finally commerce and fashion as the fastest layers. In a loosely-coupled way, each layer depends on the layer below. In turn, the accumulation of each successive layer enables an "adjacent possible" filled with more opportunities.
21 |
22 | Likewise, the expressiveness of CSS and JavaScript is only made possible on a foundation of HTML, which itself requires a URL to be reachable, which in turn depends on the HyperText Transfer Protocol, which sits atop the bedrock of TCP/IP.
23 |
24 | Each of the web's shearing layers can be peeled back to reveal a layer below. Running that process in reverse—applying each layer in turn—is a key principle of resilient web design.
25 |
26 | ##Progressive enhancement
27 |
28 | In 2003, the South by Southwest festival in Austin, Texas was primarily an event for musicians and filmmakers. Today the music and film portions are eclipsed by the juggernaut of South by Southwest Interactive, dedicated to all things digital. In 2003, South by Southwest Interactive was a small affair, squeezed into one corner of one floor of the Austin Convention Center. It was a chance for a few web designers and bloggers to get together and share ideas. That year, Steven Champeon and Nick Finck presented a talk entitled _Inclusive Web Design For the Future with Progressive Enhancement_. They opened with this call to arms:
29 |
30 | > Web design must mature and accept the developments of the past several years, abandon the exclusionary attitudes formed in the rough and tumble dotcom era, realize the coming future of a wide variety of devices and platforms, and separate semantic markup from presentation logic and behavior.
31 |
32 | Like Tim Berners-Lee, Steven Champeon had experience of working with SGML, the markup language that would so heavily influence HTML. In dealing with documents that needed to be repurposed for different outputs, he came to value the separation of structure from presentation. A meaningfully marked up document can be presented in multiple ways through the addition of CSS and JavaScript.
33 |
34 | This layered approach to the web allows the same content to be served up to a wide variety of people. But this doesn't mean that everyone gets the same experience. Champeon realised that a strong separation of concerns would allow enhancements to be applied according to the capabilities of the end user's device.
35 |
36 | To paraphrase Karl Marx, progressive enhancement allows designers to ask from each browser according to its ability, and to deliver to each device according to its needs.
37 |
38 | ###Do websites need to look exactly the same in every browser?
39 |
40 | Some web designers were concerned that progressive enhancement would be a creative straitjacket. Designing for the lowest common denominator did not sound like a recipe for progress. But this was a misunderstanding. Progressive enhancement asks that designers *start* from the lowest common denominator (a well marked-up document), but there is no limit to where they can go from there.
41 |
42 | In fact, it's the very presence of a solid baseline of HTML that allows web designers to experiment with the latest and greatest CSS. Thanks to Postel's Law and the loose error-handling model of CSS, designers are free to apply styles that only work in the newest browsers.
43 |
44 | This means that not everyone will experience the same visual design. This is a feature of the web, not a bug. New browsers and old browsers; monochrome displays and multi-coloured displays; fast connections and slow connections; big screens, small screens, and no screens; everyone can access your content. That content *should* look different in such varied situations. If a website looks the same on a ten-year-old browser as it does on the newest devices, then it probably isn't taking advantage of the great flexibility that the web offers.
45 |
46 | To emphasise this, designer Dan Cederholm created a website to answer the question, "Do websites need to look exactly the same in every browser?" You can find the answer to that question at the URL:
47 |
48 | dowebsitesneedtolookexactlythesameineverybrowser.com
49 |
50 | At the risk of spoiling the surprise for you, the answer is a resounding "No!" If you visit that website, you will see that answer proudly displayed. But depending on the capabilities of your browser, you may or may not see some of the stylistic flourishes applied to that single-word answer. Even if you don't get *any* of the styles, you'll still get the content marked up with semantic HTML.
51 |
52 | ##Cutting the mustard
53 |
54 | Separating structure and presentation is relatively straightforward. You can declare whatever styles you want, safe in the knowledge that browsers will ignore what they don't understand. Separating structure and behaviour isn't quite so easy. If you give a browser some JavaScript that it doesn't understand, not only will it not apply the desired behaviour, it will refuse to parse the rest of the JavaScript.
55 |
56 | Before you use a particular feature in JavaScript, it's worth testing to see if that feature is supported. This kind of feature detection can save your website's visitors from having a broken experience because of an unsupported feature. If you want to use Ajax, check first that the browser supports the object you're about to use to enable that Ajax functionality. If you want to use the geolocation API, check first that the browser supports it.
57 |
58 | A team of web developers working on the BBC news website referred to this kind of feature detection as cutting the mustard. Browsers that cut the mustard get an enhanced experience. Browsers that don't cut the mustard still get access to the content, but without the JavaScript enhancements.
59 |
60 | Feature detection, cutting the mustard, whatever you want to call it, is a fairly straightforward technique. Let's say you want to traverse the DOM using querySelector and attach events to some nodes in the document using addEventListener. Your mustard-cutting logic might look something like this:
61 |
62 | if (document.querySelector && window.addEventListener) {
63 | // Enhance!
64 | }
65 |
66 | There are two points to note here:
67 |
68 | 1. This is feature detection, not browser detection. Instead of asking "which browser are you?" and trying to infer feature support from the answer, it is safer to simply ask "do you support this feature?"
69 | 2. There is no else statement.
70 |
71 | ##Aggressive enhancement
72 |
73 | Back when web designers were trying to exert print-like control over web browsers, a successful design was measured in pixel perfection: did the website look exactly the same in every browser?
74 |
75 | Unless every browser supported a particular feature—like, say, rounded corners in CSS—then that feature was off the table. Instead, designers would fake it with extra markup and images. The resulting websites lacked structural honesty. Not only was this a waste of talent and energy on the part of the designers, it was a waste of the capabilities of modern web browsers.
76 |
77 | The rise of mobiles, tablets, and responsive design helped to loosen this restrictive mindset. It is no longer realistic to expect pixel-perfect parity across devices and browsers. But you'll still find web designers bemoaning the fact that they have to support an outdated browser because a portion of their audience is still using it.
78 |
79 | They're absolutely right. Anyone using that older browser should have access to the same content as someone using the latest and greatest web browser. But that doesn't mean they should get the same experience. As Brad Frost puts it:
80 |
81 | > There is a difference between support and optimization.
82 |
83 | *Support* every browser ...but *optimise* for none.
84 |
85 | Some designers have misunderstood progressive enhancement to mean that *all* functionality must be provided to *everyone*. It's the opposite. Progressive enhancement means providing *core* functionality to everyone. After that, it's every browser for itself. Far from restricting what features you can use, progressive enhancement provides web designers with a way to safely use the latest and greatest features without worrying about older browsers. Scott Jehl of the agency Filament Group puts it succinctly:
86 |
87 | > Progressive Enhancement frees us to focus on the costs of building features for modern browsers, without worrying much about leaving anyone out. With a strongly qualified codebase, older browser support comes nearly for free.
88 |
91 | If a website is built using progressive enhancement then it's okay if a particular feature isn't supported or fails to load: Ajax, geolocation, whatever. As long as the core functionality is still available, web designers don't need to bend over backwards trying to crowbar support for newer features into older browsers.
92 |
93 | You also get a website that's more resilient to JavaScript's error-handling model. Mat Marquis worked alongside Scott Jehl on the responsive website for the Boston Globe. He noted:
94 |
95 | > Lots of cool features on the Boston Globe don’t work when JS breaks; “reading the news” is not one of them.
96 |
97 | The trick is identifying what is considered core functionality and what is considered an enhancement.
--------------------------------------------------------------------------------
/markdown/chapter06.md:
--------------------------------------------------------------------------------
1 | #Chapter 6: Steps
2 |
3 | "Always design a thing by considering it in its next larger context", said the Finnish architect Eliel Saarinen. "A chair in a room, a room in a house, a house in an environment, an environment in a city plan."
4 |
5 | At first glance, web design appears to be a kind of graphic design. Using graphic design tools like Photoshop to design websites reinforces this view. But to get to the heart of a website's purpose, we should consider the interface in its larger context: what are people trying to accomplish?
6 |
7 | When designing for the web, it's tempting to think in terms of interactions like swiping, tapping, clicking, scrolling, dragging and dropping. But very few people wake up in the morning looking forward to a day of scrolling and tapping. They're more likely to think in terms of reading, writing, sharing, buying and selling. Web designers need to see past the surface-level actions to find the more meaningful verbs beneath.
8 |
9 | In their book _Designing With Progressive Enhancement_, the Filament Group describe a technique they call "the x-ray perspective":
10 |
11 | > Taking an x-ray perspective means looking "through" the complex widgets and visual styles of a design, identifying the core content and functional pieces that make up the page, and finding a simple HTML equivalent for each that will work universally.
12 |
13 | If you're not used to this approach to web design, it can take some getting used to. But after a while it becomes a habit and then it's hard *not* to examine interfaces in this way. It's like trying not to notice bad kerning, or trying not to see the arrow in the whitespace of the FedEx logo, or trying not to remember that all ducks are actually wearing dog masks.
14 |
15 | Here's a three-step approach I take to web design:
16 |
17 | 1. Identify core functionality.
18 | 2. Make that functionality available using the simplest possible technology.
19 | 3. Enhance!
20 |
21 | Identifying core functionality might sound like it's pretty straightforward, and after a bit of practice, it is. But it can be tricky at first to separate what's truly necessary from what's nice to have.
22 |
23 | ##Information
24 |
25 | Let's say you're a news provider. That right there is the core functionality—to provide news. There are many, many other services you could also provide: interactive puzzles, realtime notifications, and more. Valuable as those services are, they're probably not as important as making sure that people have access to news.
26 |
27 | With that core functionality identified, it's time to move on to step two: how can you make that core functionality available using the simplest possible technology?
28 |
29 | Theoretically, a plain text file would be the simplest possible way of providing the news. But as we're talking specifically about the web, let's caveat this step: how can you make the core functionality available using the simplest possible **web** technology? That would be an HTML file served up at a URL.
30 |
31 | Even at this early stage it's possible to overcomplicate things. The HTML could be unnecessarily bloated. The URL could be unnecessarily verbose; hard to share or recall.
32 |
33 | Now that the news has been marked up with the appropriate HTML elements—articles, headings, paragraphs, lists, and images—it's time for step three: enhance!
34 |
35 | By default, the news will be presented using the browser's own stylesheet. It's legible, but not exactly pleasurable. By applying your own CSS, you can sculpt the content into a more pleasing shape. Whitespace, leading, colour and contrast are all at your disposal. You can even use custom fonts—an enhancement that was impossible on the web for many years.
36 |
37 | There's no guarantee that every browser will be capable of executing every CSS declaration that you throw at it. That's okay. Those browsers will ignore what they don't understand. Crucially, the news is still available to everyone, regardless of the CSS capabilities of their browser.
38 |
39 | For browsers on large-screen devices, you can introduce layout. It might seem odd at first to think of layout as an enhancement, but that's the lesson of mobile-first responsive design. Consider the content first, then mark it up with a sensible source order, then apply layout declarations within media queries.
40 |
41 | Thanks to the ever-evolving nature of CSS, there are multiple ways of applying layout. As Andy Tanenbaum said:
42 |
43 | > The nice thing about standards is that you have so many to choose from.
44 |
45 | ##Communication
46 |
47 | Applying the three-step process to a news site is relatively straightforward. Catching up with the news is a fairly passive act. To really test this process, we need to apply it to something more interactive.
48 |
49 | Suppose we were building a social network. The people using our tool need to be able to communicate with one another regardless of where in the world they are. The core functionality is sending and receiving messages.
50 |
51 | Displaying messages in a web browser isn't difficult. There might be a lot of complexity on the server involving databases, syncing, queueing, and load balancing, but the HTML needed to structure a reverse-chronological list isn't very different from the HTML needed for a news site.
52 |
53 | Sending a message from the browser to the web server requires HTML that is interactive. That's where forms come in. In this case, a form with a text input and a submit button should be enough, at least for the basic functionality.
54 |
55 | People can now receive and respond to messages on our social network, no matter what kind of device or browser they are using. Now the trick is to improve the experience without breaking that fundamental activity.
56 |
57 | If we were to leave the site in this HTML-only state, I don't think we'd be celebrating our company's IPO anytime soon. To really distinguish our service from the competition, we need that third step in the process: enhance!
58 |
59 | At the very least, we can apply the same logic we used for the news site and style our service. Using CSS we can provide colour, texture, contrast, web fonts, and for larger screens, layout. But let's not stop with the presentation. Let's improve the interaction too.
60 |
61 | Right now this social network has the same kind of page-based interaction as a news site. Every time someone sends a message to the server, the server sends back a whole new page to the browser. We can do better than that. Time for some Ajax.
62 |
63 | We can intercept the form submission and send the data to the server using Ajax—I like using the word Hijax to describe this kind of Ajax interception. If there's a response from the server, we can also update part of the current page instead of refreshing the whole page. This would also be a good time to introduce some suitable animation.
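
As a sketch of this Hijax pattern (the form's class name and field names here are illustrative, not from any real site), the enhancement might look something like this, complete with a mustard-cutting check:

```javascript
// A pure helper: turn name/value pairs into a URL-encoded request body.
function encodeFields(fields) {
  return Object.keys(fields)
    .map(function (key) {
      return encodeURIComponent(key) + '=' + encodeURIComponent(fields[key]);
    })
    .join('&');
}

// Only wire up the enhancement in browsers that cut the mustard.
if (typeof document !== 'undefined' &&
    document.querySelector && window.addEventListener && window.fetch) {
  var form = document.querySelector('form.message');
  if (form) {
    form.addEventListener('submit', function (event) {
      // Hijax: intercept the submission...
      event.preventDefault();
      // ...and send the data to the server with Ajax instead.
      fetch(form.action, {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: encodeFields({ message: form.elements.message.value })
      }).then(function (response) {
        // Update part of the current page instead of refreshing the whole page.
      });
    });
  }
}
```

Note that, as with the mustard-cutting example, there is no else statement: browsers that fail the test simply fall back to the regular page-based form submission.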
64 |
65 | We can go further. Browsers that support WebSockets can receive messages from the server. People using those browsers could get updates as soon as they've been sent. It's even possible to use peer-to-peer connections between browsers to allow people to communicate directly.
66 |
67 | Not every browser supports this advanced functionality. That's okay. The core functionality—sending and receiving messages—is still available to everyone.
68 |
69 | ##Creation
70 |
71 | What if our social network were more specialised? Let's make it a photo-sharing service. That raises the bar a bit. Instead of sending and receiving messages, the core functionality is now sending and receiving images.
72 |
73 | The interface needs to show a reverse-chronological list of images. HTML can handle that. Once again we need a form to send data to the server, but this time it needs to be a file upload instead of a text field.
74 |
75 | With those changes, the core functionality is in place. Time to enhance.
76 |
77 | As well as all the existing enhancements—CSS, web fonts, Ajax, WebSockets—we could make use of the File API introduced in HTML5. This allows us to manipulate the image directly in the browser. We could apply effects to the image before sending it to the server. Using CSS filters, we can offer a range of image enhancements from sepia tones to vignettes. But if a browser doesn't support the File API or CSS filters, people can still upload their duck-facing selfies.
78 |
79 | ##Collaboration
80 |
81 | There was a time when using software meant installing separate programs on your computer. Today it's possible to have a machine with nothing more than a web browser installed on it. Writing emails, looking up contact details, making calendar appointments, bookkeeping and other financial tasks can all be done without having to install bespoke applications. Instead, the act of visiting a URL can conjure up the tool you need when you need it.
82 |
83 | Delivering software over the web doesn't just replace the desktop-centric way of working. The presence of an internet connection opens up possibilities for all kinds of collaboration. Take, for example, the kinds of applications that were once called "word processors". As long as those programs were tethered to individual machines, trying to collaboratively edit a document was bound to be a tricky task requiring plenty of coordination, sending files back and forth. Using the web, the act of sharing a single URL could allow multiple people to work on the same document.
84 |
85 | Let's apply the three-step process to a web-based word processor:
86 |
87 | > Identify core functionality.
88 |
89 | The tautological answer would be "processing words." Not very helpful. What do people actually *do* with this software? They write. They share. They edit.
90 |
91 | > Make that functionality available using the simplest possible technology.
92 |
93 | Looking at our three verbs—writing, sharing, and editing—we get one of them for free just by using URLs: sharing. The other two—writing and editing—require the use of a form. A basic textarea element can act as the receptacle for the words, sentences, and paragraphs that will make up everything from technical reports to the great American novel. Submitting that content to a web server means it can be saved for later.
94 |
95 | Technically, that's a web-based word processor, accessible to anyone with a web browser and an internet connection. But the experience is clunky and dull. It would be a shame not to take advantage of some of the slicker options available in modern browsers.
96 |
97 | > Enhance!
98 |
99 | Using JavaScript, the humble textarea can be replaced with a richer editing interface, detecting each keystroke and applying styling on the fly. Web fonts can make the writing experience more beautiful. Ajax will allow work to be saved to the server almost constantly, without the need for a form submission. WebSockets provide the means for multiple people to work on the same document at the same time.
100 |
101 | Both Ajax and WebSockets require an internet connection to work. There's no guarantee of a stable internet connection, especially if you're trying to work on a train or in a hotel. Modern browsers provide features which, after the initial page load, can turn the network itself into an enhancement.
102 |
103 | If a browser supports some form of local storage, then data can be stored in a client-side database. Flaky network connections or unexpected power outages won't get in the way of saving that important document. Using Service Workers, web developers can provide instructions on what to do when the browser (or the server) is offline.
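
As a rough sketch (the function name is hypothetical), a draft-saving store might treat persistence itself as the enhancement—write to localStorage where it exists, and fall back to an in-memory object (losing persistence across visits, but not the ability to keep working) where it doesn't:

```javascript
// Save drafts to localStorage if the browser supports it;
// otherwise keep them in memory for the lifetime of the page.
function createDraftStore() {
  var persistent = typeof localStorage !== 'undefined';
  var memory = {};
  return {
    save: function (key, text) {
      if (persistent) {
        localStorage.setItem(key, text);
      } else {
        memory[key] = text;
      }
    },
    load: function (key) {
      if (persistent) {
        return localStorage.getItem(key);
      }
      return (key in memory) ? memory[key] : null;
    }
  };
}
```

The same save and load calls work either way; how well the draft survives a power outage is the only difference.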
104 |
105 | These are modern browser features that we should be taking full advantage of ...once we've made sure that we're providing a basic experience for everyone.
106 |
107 | ##Scale
108 |
109 | Ward Cunningham, the creator of the wiki, coined the term "technical debt" to describe a common problem in the world of software. Decisions made in haste at the beginning of a project lead to a cascade of issues further down the line. I like to think of the three-step layered approach as a kind of "technical credit." Taking the time to provide core functionality at the beginning gives you the freedom to go wild with experimentation from then on.
110 |
111 | Some people have misunderstood progressive enhancement to mean foregoing the latest and greatest browser technologies, but in fact the opposite is true. Taking a layered approach to building on the web gives you permission to try cutting-edge JavaScript and APIs, regardless of how many or how few browsers currently implement them.
112 |
113 | We've looked at some examples of applying the three-step approach to a few products and services—news, social networking, photo sharing, and word processing. You can apply this approach to many more services: making and updating items in a to-do list, managing calendar appointments, looking up directions, making reservations at nearby restaurants. Each one can be built with the same process:
114 |
115 | 1. Identify core functionality.
116 | 2. Make that functionality available using the simplest possible technology.
117 | 3. Enhance!
118 |
119 | This approach works at different scales. It doesn't just work at the highest level of the service; it can also be applied at the level of individual URLs within.
120 |
121 | Ask "what is the core functionality of this URL?", make that functionality available using the simplest possible technology, and then enhance from there. This can really clarify which content is most important, something that's invaluable in a mobile-first responsive workflow. Once you've established that, make sure that content is sent from the server as HTML (the simplest possible technology). Then, using conditional loading, you could decide to make Ajax requests for supporting content if the screen real-estate is available. For the URL of an individual news story, the story itself would be sent in the initial response, but related stories or comments could be requested from the server only as needed (although you can still provide links to the related stories and comments for everyone).
122 |
123 | We can go deeper. We can apply the three-step process at the scale of individual components within a page. "What is the core functionality of this component? How can I make that functionality available using the simplest possible technology? Now how can I enhance it?"
124 |
125 | A component might be designed to be an all-singing all-dancing interactive map. With x-ray goggles, the core functionality reveals itself to be something much simpler: showing a location. Provide the address of that location in text: the simplest possible technology. Now you can enhance.
126 |
127 | It's worth remembering that enhancements can be provided on a sliding scale. The first enhancement for a text address might be to provide a static image. The next level up from that would be to swap out the static image with an interactive Ajax-powered slippy map. If a browser supports the geolocation API, you could show the distance to the location. Layer on some animations and transitions to help convey the directions better.
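
That geolocation enhancement only needs a little spherical trigonometry. Here's a sketch using the haversine formula—the venue coordinates and function names are made up for illustration:

```javascript
// Great-circle distance between two points, in kilometres,
// using the haversine formula.
function distanceInKm(lat1, lon1, lat2, lon2) {
  function toRadians(degrees) { return degrees * Math.PI / 180; }
  var R = 6371; // mean radius of the Earth in kilometres
  var dLat = toRadians(lat2 - lat1);
  var dLon = toRadians(lon2 - lon1);
  var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
          Math.cos(toRadians(lat1)) * Math.cos(toRadians(lat2)) *
          Math.sin(dLon / 2) * Math.sin(dLon / 2);
  return 2 * R * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
}

// Only enhance if the browser supports the geolocation API.
if (typeof navigator !== 'undefined' && 'geolocation' in navigator) {
  navigator.geolocation.getCurrentPosition(function (position) {
    var km = distanceInKm(position.coords.latitude,
                          position.coords.longitude,
                          51.5045, -0.0865); // the venue's location (illustrative)
    // ...display the distance alongside the text address.
  });
}
```

Browsers without geolocation never run the callback; the text address and the map remain untouched.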
128 |
129 | Site navigation is another discrete component that lends itself well to a sliding scale of enhancements. The core functionality of navigation is to provide links to resources. The simplest—and still the best—technology to enable that is the humble hyperlink. A list of links should do the trick. With that in place, you are now free to enhance it into something really compelling. Off-canvas navigation, progressive disclosure, sliding, swiping, fading, expanding ...the sky's the limit.
130 |
131 | Because enhancements can be layered on according to the capabilities of each browser, it quickly becomes clear that this approach doesn't simply reduce down to having two versions of everything (the basic version and the enhanced version). Instead the service, the URLs, and the components you are designing could be experienced in any number of ways. And that's okay.
132 |
133 | Websites do not need to look exactly the same in every browser.
--------------------------------------------------------------------------------
/markdown/chapter07.md:
--------------------------------------------------------------------------------
1 | #Chapter 7: Challenges
2 |
3 | The sixth annual conference on hypertext took place in San Antonio, Texas in December 1991. Tim Berners-Lee's World Wide Web project was, at that time, just starting to take shape. Thinking the conference organisers and attendees would appreciate the project, he submitted a proposal to Hypertext '91. The proposal was rejected.
4 |
5 | The hypertext community felt that the World Wide Web project was far too simplistic. Almost every other hypertext system included the concept of two-way linking. If a resource moved, any links pointing to that resource would update automatically. The web provided no such guarantees. Its system of linking was much simpler—you just link to something and that's it. To the organisers of Hypertext '91, this seemed hopelessly naive. They didn't understand that the simplicity of the web was actually its strength. Because linking was so straightforward, anyone could do it. That would prove to be crucial in the adoption and success of the World Wide Web.
6 |
7 | It's all too tempting to quickly declare that an approach is naive, overly simplistic, and unrealistic. The idea that a website can simultaneously offer universal access to everyone while also providing a rich immersive experience for more capable devices ...that also seems hopelessly naive.
8 |
9 | This judgement has been handed down many times over the history of the web.
10 |
11 | ## "This is too simple"
12 |
13 | When the Web Standards Project ran its campaign encouraging designers to switch from TABLEs for layout to CSS, it was met with resistance. Time and time again they were criticised for their naivety. "Sure, a CSS-based layout might be fine for a simple personal site but there's no way it could scale to a large complex project."
14 |
15 | Then Doug Bowman spearheaded the CSS-based redesign of Wired.com and Mike Davidson led the CSS-based redesign of ESPN.com. After that the floodgates opened.
16 |
17 | When Ethan Marcotte demonstrated the power of responsive design, it was met with resistance. "Sure, a responsive design might work for a simple personal site but there's no way it could scale to a large complex project."
18 |
19 | Then the Boston Globe launched its responsive site. Microsoft made their homepage responsive. The floodgates opened again.
20 |
21 | It's a similar story today. "Sure, progressive enhancement might work for a simple personal site, but there's no way it could scale to a large complex project."
22 |
23 | The floodgates are ready to open. We just need you to create the poster child for resilient web design.
24 |
25 | ## "This is too difficult"
26 |
27 | Building resilient websites is challenging. It takes time to apply functionality and features in a considered layered way. The payoff is a website that can better react to unexpected circumstances—unusual browsers, flaky network connections, outdated devices. Nonetheless, for many web designers, the cost in time seems to be too high.
28 |
29 | It's worth remembering that building with progressive enhancement doesn't mean that *everything* needs to be made available to *everyone*. Instead it's the core functionality that counts. If every single feature needed to be available to every browser on every device, that would indeed be an impossibly arduous process. This is why prioritisation is so important. As long as the core functionality is available using the simplest possible technology, you can—with a clear conscience—layer on more advanced features.
30 |
31 | Still, it's entirely possible that this approach will result in duplicated work. If you build an old-fashioned client-server form submission process and then enhance it with JavaScript, you may end up repeating the form-processing on the client as well as the server. That can be mitigated if you are also using JavaScript on the server. It's theoretically possible to write universal JavaScript so that the server and browser share a single codebase. Even without universal JavaScript, I still think it's worth spending time to create technical credit.
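
One way to contain that duplication is to keep the validation rules in a single function that can run both in Node on the server and in the browser. A minimal sketch (the function name, field names, and rules are hypothetical):

```javascript
// Shared validation logic: the same function can run on a Node server
// and in the browser, so the rules only need to be written once.
// (The field names and rules here are hypothetical examples.)
function validateRegistration(fields) {
  const errors = [];
  // A deliberately simple email check; real rules would be stricter
  if (!fields.email || !/^[^@\s]+@[^@\s]+$/.test(fields.email)) {
    errors.push('A valid email address is required.');
  }
  if (!fields.name || fields.name.trim() === '') {
    errors.push('A name is required.');
  }
  return errors; // an empty array means the submission is valid
}
```

The server runs these checks after a traditional form submission; the browser can run the very same function before submitting, as an enhancement.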
32 |
33 | The UK's Government Service design manual provides an even shorter form of the three-step process I've outlined:
34 |
35 | 1. First, just make it work
36 | 2. Second, make it work better
37 |
38 | The design manual also explains why:
39 |
40 | > If you build pages with the idea that parts other than HTML are optional, you’ll create a better and stronger web page.
41 |
42 | This kind of resilience means that the time you spend up-front is well invested. You might also notice an interesting trend: the more you use this process, the less time it will take.
43 |
44 | Moving from TABLEs to CSS seemed like an impossibly idealistic goal. Designers were comfortable using TABLE and FONT elements for layout. Why would they want to learn a whole new way of working? I remember how tricky it was to make my first CSS-based layouts after years of using hacks. It took me quite some time. But my second CSS-based layout didn't take quite so long. After a while, it became normal.
45 |
46 | Designers comfortable with fixed-width layouts had a hard time with responsive design. That first flexible layout was inevitably going to take quite a while to build. But the second flexible layout didn't take quite so long. After a while, it became normal.
47 |
48 | It's no different with the layered approach needed for building resilient websites. If you're not used to working this way, the first time you do it will take quite some time. But the second time won't take quite so long. After a while, it will become normal.
49 |
50 | ## "How do I convince...?"
51 |
52 | The brilliant computer scientist Grace Hopper kept an unusual timepiece on her wall. It ran counter-clockwise. When quizzed on this, she pointed out that it was an arbitrary convention, saying:
53 |
54 | > Humans are allergic to change. They love to say, "We've always done it this way." I try to fight that. That's why I have a clock on my wall that runs counter-clockwise.
55 |
56 | Behaviour change is hard. Even if you are convinced of the benefits of a resilient approach to web design, you may find yourself struggling to convince your colleagues, your boss, or your clients. It was ever thus. Take comfort from the history of web standards and responsive design. Those approaches were eventually adopted by people who were initially resistant.
57 |
58 | Demonstrating the benefits of progressive enhancement can be tricky. Because the payoff happens in unexpected circumstances, the layered approach is hard to sell. Most people will never even know whether or not a site has been built in a resilient way. It's a hidden mark of quality that will go unnoticed by people with modern browsers on new devices with fast network connections.
59 |
60 | For that same reason, you can start to implement this layered approach without having to convince your colleagues, your boss, or your clients. If they don't care, then they also won't notice. As Grace Hopper also said, "it's easier to ask forgiveness than it is to get permission."
61 |
62 | ### Tools
63 |
64 | Changing a workflow or a process can be particularly challenging if it clashes with the tools being used. A tool is supposed to help people get their work done in a more efficient way. The tool should be subservient to the workflow. All too often, tools instead dictate a preferred way of working. Whether it's WYSIWYG editors, graphic design programs, content management systems, or JavaScript frameworks, tools inevitably influence workflows.
65 |
66 | If you are aware of that influence, and you can recognise it, then you are in a better position to pick and choose the tools that will work best for you. There are many factors that play into the choice of frameworks, for example: "Is it well-written?", "Does it have an active community behind it?", "Does it have clear documentation?". But perhaps the most important question to ask is, "Does its approach match my own philosophy?"
67 |
68 | Every framework has a philosophy because every framework was written by people. If your philosophy aligns with that of the framework, then it will help you work more efficiently. But if your philosophy clashes with that of the framework, you will be fighting it every step of the way. It might even be tempting to just give up and let the framework dictate your workflow. Then the tail would be wagging the dog.
69 |
70 | Choose your tools wisely. It would be a terrible shame if you abandoned the resilient approach to web design because of a difference of opinion with a piece of software.
71 |
72 | Differences of opinion often boil down to a mismatch in priorities. At its heart, the progressive enhancement approach prioritises the needs of people, regardless of their technology. Tools, frameworks, and code libraries, on the other hand, are often built to prioritise the needs of designers and developers. That's not a bad thing. Developer convenience is hugely valuable. But speaking personally, I think that user needs should trump developer convenience.
73 |
74 | When I'm confronted with a problem, and I have the choice of making it the user's problem or my problem, I'll make it my problem every time. That's my job.
75 |
76 | ### Future friendly
77 |
78 | In September of 2011, I was speaking at a conference in Tennessee along with some people much smarter than me. Once the official event was done, we lit out for the countryside where we had rented a house for a few days. We were gathering together to try to figure out where the web was headed.
79 |
80 | We were, frankly, freaked out. The proliferation of mobile devices had changed everything. Tablets were on the rise. People were talking about internet TVs. We were hoping to figure out what the next big thing would be. Internet-enabled fridges, perhaps?
81 |
82 | In the end, the only thing we could be certain of was uncertainty:
83 |
84 | > Disruption will only accelerate. The quantity and diversity of connected devices—many of which we haven't imagined yet—will explode, as will the quantity and diversity of the people around the world who use them.
85 |
86 | That isn't cause for despair; it's cause for celebration. We could either fight this future or embrace it. Realising that it was impossible to be future-proof, we instead resolved to be future-friendly:
87 |
88 | 1. Acknowledge and embrace unpredictability.
89 | 2. Think and behave in a future-friendly way.
90 | 3. Help others do the same.
91 |
92 | That first step is the most important: acknowledging and embracing unpredictability. That is the driving force behind resilient web design. The best way to be future-friendly is to be backwards-compatible.
93 |
94 | ### Assumptions
95 |
96 | "We demand rigidly-defined areas of doubt and uncertainty!" cried the philosophers in Douglas Adams' _Hitchhiker's Guide To The Galaxy_.
97 |
98 | As pattern-matching machines, we are quick to identify trends and codify them into assumptions. Here are just some assumptions that were made over the history of web design:
99 |
100 | * Everyone has a monitor that is 640 pixels wide.
101 | * Everyone has the Flash plugin installed.
102 | * Everyone has a monitor that is 800 pixels wide.
103 | * Everyone has a mouse and a keyboard.
104 | * Everyone has a monitor that is 1024 pixels wide.
105 | * Everyone has a fast internet connection.
106 |
107 | The proliferation of mobile devices blew those assumptions out of the water. The rise of mobile didn't create new uncertainties—instead it shone a light on the uncertainties that already existed.
108 |
109 | That should have been a valuable lesson. But before too long the old assumptions were replaced with new ones:
110 |
111 | * There are some activities that people will never want to do on their phones.
112 | * Every phone has a touch screen.
113 | * Everyone using a phone is in a hurry.
114 | * Every browser on every phone supports JavaScript.
115 |
116 | Assumptions are beguiling. If only we could agree on certain boundaries, then wouldn't web design be so much easier to control?
117 |
118 | As tempting as this siren call is, it obfuscates the true nature of the ever-changing uncertain web. Carl Sagan put it best in his book _The Demon-Haunted World_:
119 |
120 | > It is far better to grasp the universe as it really is than to persist in delusion, however satisfying and reassuring.
121 |
122 | ## The future
123 |
124 | I wish I could predict the future. The only thing that I can predict for sure is that things are going to change.
125 |
126 | I don't know what kind of devices people will be using on the web. I don't know what kind of software people will be using on the web.
127 |
128 | The future, like the web, is unknown.
129 |
130 | The future, like the web, will be written by you.
131 |
--------------------------------------------------------------------------------
/offline/index.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 |
9 | Resilient Web Design
10 |
11 |
12 |
13 |
14 |
15 |
16 |
17 |
18 |
19 |
20 |
21 |
22 |
23 |
24 | Resilient Web Design
25 |
26 |
27 |
28 |
29 |
Offline
30 |
31 |
Sorry.
32 |
33 |
There seems to be a problem with the network.
34 |
35 |
36 |
37 |
41 |
42 |
43 |
44 |
45 |
--------------------------------------------------------------------------------
/podcast.rss:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 | Resilient Web Design
5 | A web book by Jeremy Keith.
6 | Licensed under a Creative Commons Attribution‐ShareAlike 4.0 International License.
7 |
8 | Resilient Web Design
9 | https://resilientwebdesign.com
10 | https://resilientwebdesign.s3.amazonaws.com/podcast/artwork.png
11 |
12 | en
13 | https://resilientwebdesign.com
14 |
15 | Jeremy Keith
16 | A web book by Jeremy Keith.
17 | The podcast of the web book by Jeremy Keith.
18 |
19 | Jeremy Keith
20 | jeremy@adactio.com
21 |
22 |
23 |
24 |
25 |
26 | no
27 |
28 |
29 | Chapter 7: Challenges
30 | The fourth annual conference on hypertext took place in San Antonio, Texas in December 1991. Tim Berners‐Lee’s World Wide Web project was starting to take shape then. Thinking the conference organisers and attendees would appreciate the project, he submitted a proposal to Hypertext ’91. The proposal was rejected.
31 | https://resilientwebdesign.com/chapter7/
32 | https://resilientwebdesign.com/chapter7/
33 |
34 | Mon, 6 Feb 2017 17:00:00 GMT
35 | Jeremy Keith
36 | no
37 |
38 | 14:35
39 |
40 |
41 | Chapter 6: Steps
42 | “Always design a thing by considering it in its next larger context”, said the Finnish architect Eliel Saarinen. “A chair in a room, a room in a house, a house in an environment, an environment in a city plan.”
43 | https://resilientwebdesign.com/chapter6/
44 | https://resilientwebdesign.com/chapter6/
45 |
46 | Sun, 29 Jan 2017 17:00:00 GMT
47 | Jeremy Keith
48 | no
49 |
50 | 17:13
51 |
52 |
53 | Chapter 5: Layers
54 | In his classic book How Buildings Learn Stewart Brand highlights an idea by the British architect Frank Duffy: “A building properly conceived is several layers of longevity.”
55 | https://resilientwebdesign.com/chapter5/
56 | https://resilientwebdesign.com/chapter5/
57 |
58 | Sun, 22 Jan 2017 17:30:00 GMT
59 | Jeremy Keith
60 | no
61 |
62 | 12:20
63 |
64 |
65 | Chapter 4: Languages
66 | Jon Postel was one of the engineers working on the ARPANET, the precursor to the internet. He wanted to make sure that the packets—or “datagrams”—being shuttled around the network were delivered in the most efficient way. He came to realise that a lax approach to errors was crucial to effective packet switching.
67 | https://resilientwebdesign.com/chapter4/
68 | https://resilientwebdesign.com/chapter4/
69 |
70 | Sun, 15 Jan 2017 16:00:00 GMT
71 | Jeremy Keith
72 | no
73 |
74 | 19:18
75 |
76 |
77 | Chapter 3: Visions
78 | Design adds clarity. Using colour, typography, hierarchy, contrast, and all the other tools at their disposal, designers can take an unordered jumble of information and turn it into something that’s easy to use and pleasurable to behold. Like life itself, design can win a small victory against the entropy of the universe, creating pockets of order from the raw materials of chaos.
79 | https://resilientwebdesign.com/chapter3/
80 | https://resilientwebdesign.com/chapter3/
81 |
82 | Sun, 8 Jan 2017 16:00:00 GMT
83 | Jeremy Keith
84 | no
85 |
86 | 25:41
87 |
88 |
89 | Chapter 2: Materials
90 | At the risk of teaching grandmother to suck eggs, I’d like you to think about what happens when a browser parses an HTML element. Take, for example, a paragraph element with some text inside it. There’s an opening P tag, a closing P tag, and between those tags, there’s the text.
91 | https://resilientwebdesign.com/chapter2/
92 | https://resilientwebdesign.com/chapter2/
93 |
94 | Mon, 26 Dec 2016 12:00:00 GMT
95 | Jeremy Keith
96 | no
97 |
98 | 14:50
99 |
100 |
101 | Chapter 1: Foundations
102 | The history of human civilisation is a tale of cumulative effort. Each generation builds upon the work of their forebears. Sometimes the work takes a backward step. Sometimes we wander down dead ends. But we struggle on. Bit by bit our species makes progress. Whether the progress is incremental or a huge leap forward, it is always borne upon the accomplishments of those who came before us.
103 | https://resilientwebdesign.com/chapter1/
104 | https://resilientwebdesign.com/chapter1/
105 |
106 | Fri, 16 Dec 2016 09:00:00 GMT
107 | Jeremy Keith
108 | no
109 |
110 | 14:13
111 |
112 |
113 | Introduction
114 | With a title like Resilient Web Design, you might think that this is a handbook for designing robust websites. This is not a handbook. It’s more like a history book.
115 | https://resilientwebdesign.com/introduction/
116 | https://resilientwebdesign.com/introduction/
117 |
118 | Sun, 4 Dec 2016 12:00:00 GMT
119 | Jeremy Keith
120 | no
121 |
122 | 01:53
123 |
124 |
125 |
126 |
--------------------------------------------------------------------------------
/serviceworker.js:
--------------------------------------------------------------------------------
1 | 'use strict';
2 |
3 | // Import Jake's polyfill for async waitUntil
4 | importScripts('/js/async-waituntil.js');
5 |
6 | const version = 'v0.045::';
7 | const staticCacheName = version + 'static';
8 |
9 | function updateStaticCache() {
10 | return caches.open(staticCacheName)
11 | .then( cache => {
12 | // These items won't block the installation of the Service Worker
13 | cache.addAll([
14 | '/fonts/etbookot-italic-webfont.woff2',
15 | '/fonts/etbookot-bold-webfont.woff2',
16 | '/chapter1/images/small/enquire-within-upon-everything.jpg',
17 | '/chapter1/images/small/information-management.jpg',
18 | '/chapter1/images/small/line-mode-browser.jpg',
19 | '/chapter1/images/small/sir-tim-berners-lee.jpg',
20 | '/chapter1/images/small/submarine-cable-map.jpg',
21 | '/chapter1/images/small/sundial.jpg',
22 | '/chapter1/images/small/vague-but-exciting.jpg',
23 | '/chapter1/images/medium/enquire-within-upon-everything.jpg',
24 | '/chapter1/images/medium/information-management.jpg',
25 | '/chapter1/images/medium/line-mode-browser.jpg',
26 | '/chapter1/images/medium/sir-tim-berners-lee.jpg',
27 | '/chapter1/images/medium/submarine-cable-map.jpg',
28 | '/chapter1/images/medium/sundial.jpg',
29 | '/chapter1/images/medium/vague-but-exciting.jpg',
30 | '/chapter1/images/large/enquire-within-upon-everything.jpg',
31 | '/chapter1/images/large/information-management.jpg',
32 | '/chapter1/images/large/line-mode-browser.jpg',
33 | '/chapter1/images/large/sir-tim-berners-lee.jpg',
34 | '/chapter1/images/large/submarine-cable-map.jpg',
35 | '/chapter1/images/large/sundial.jpg',
36 | '/chapter1/images/large/vague-but-exciting.jpg',
37 | '/chapter2/images/small/zengarden.png',
38 | '/chapter2/images/medium/zengarden.png',
39 | '/chapter2/images/large/zengarden.png',
40 | '/chapter3/images/small/book-of-kells.jpg',
41 | '/chapter3/images/small/gutenberg-bible.jpg',
42 | '/chapter3/images/small/iphone.jpg',
43 | '/chapter3/images/small/jan-tchichold-medieval-manuscript-framework.png',
44 | '/chapter3/images/medium/book-of-kells.jpg',
45 | '/chapter3/images/medium/gutenberg-bible.jpg',
46 | '/chapter3/images/medium/iphone.jpg',
47 | '/chapter3/images/medium/jan-tchichold-medieval-manuscript-framework.png',
48 | '/chapter3/images/large/book-of-kells.jpg',
49 | '/chapter3/images/large/gutenberg-bible.jpg',
50 | '/chapter3/images/large/iphone.jpg',
51 | '/chapter3/images/large/jan-tchichold-medieval-manuscript-framework.png',
52 | '/chapter4/images/small/jon-postel.jpg',
53 | '/chapter4/images/small/nasa.png',
54 | '/chapter4/images/medium/jon-postel.jpg',
55 | '/chapter4/images/medium/nasa.png',
56 | '/chapter4/images/large/jon-postel.jpg',
57 | '/chapter4/images/large/nasa.png',
58 | '/chapter5/images/small/shearing-layers.jpg',
59 | '/chapter5/images/medium/shearing-layers.jpg',
60 | '/chapter5/images/large/shearing-layers.jpg',
61 | '/chapter6/images/small/duck.jpg',
62 | '/chapter6/images/small/news.png',
63 | '/chapter6/images/small/photosharing.png',
64 | '/chapter6/images/small/socialnetworks.png',
65 | '/chapter6/images/small/writing.png',
66 | '/chapter6/images/medium/duck.jpg',
67 | '/chapter6/images/medium/news.png',
68 | '/chapter6/images/medium/photosharing.png',
69 | '/chapter6/images/medium/socialnetworks.png',
70 | '/chapter6/images/medium/writing.png',
71 | '/chapter6/images/large/duck.jpg',
72 | '/chapter6/images/large/news.png',
73 | '/chapter6/images/large/photosharing.png',
74 | '/chapter6/images/large/socialnetworks.png',
75 | '/chapter6/images/large/writing.png',
76 | '/chapter7/images/small/devices.jpg',
77 | '/chapter7/images/small/grace-hopper.jpg',
78 | '/chapter7/images/small/future-friendly.png',
79 | '/chapter7/images/medium/devices.jpg',
80 | '/chapter7/images/medium/grace-hopper.jpg',
81 | '/chapter7/images/medium/future-friendly.png',
82 | '/chapter7/images/large/devices.jpg',
83 | '/chapter7/images/large/grace-hopper.jpg',
84 | '/chapter7/images/large/future-friendly.png',
85 | '/author/images/small/jeremykeith.jpg',
86 | '/author/images/medium/jeremykeith.jpg',
87 | '/author/images/large/jeremykeith.jpg',
88 | '/favicon.ico',
89 | '/manifest.json'
90 | ]);
91 | // These items must be cached for the Service Worker to complete installation
92 | return cache.addAll([
93 | '/',
94 | '/offline/',
95 | '/introduction/',
96 | '/chapter1/',
97 | '/chapter2/',
98 | '/chapter3/',
99 | '/chapter4/',
100 | '/chapter5/',
101 | '/chapter6/',
102 | '/chapter7/',
103 | '/author/',
104 | '/contents/',
105 | '/theindex/',
106 | '/css/styles.css',
107 | '/js/scripts.js',
108 | '/fonts/etbookot-roman-webfont.woff2'
109 | ]);
110 | });
111 | }
112 |
113 | // Remove caches whose name is no longer valid
114 | function clearOldCaches() {
115 | return caches.keys()
116 | .then( keys => {
117 | return Promise.all(keys
118 | .filter(key => key.indexOf(version) !== 0)
119 | .map(key => caches.delete(key))
120 | );
121 | });
122 | }
123 |
124 | self.addEventListener('install', event => {
125 | event.waitUntil(
126 | updateStaticCache()
127 | .then( () => self.skipWaiting() )
128 | );
129 | });
130 |
131 | self.addEventListener('activate', event => {
132 | event.waitUntil(
133 | clearOldCaches()
134 | .then( () => self.clients.claim() )
135 | );
136 | });
137 |
138 | self.addEventListener('fetch', event => {
139 | let request = event.request;
140 | // Look in the cache first, fall back to the network
141 | event.respondWith(
142 | // CACHE
143 | caches.match(request)
144 | .then( responseFromCache => {
145 | // Did we find the file in the cache?
146 | if (responseFromCache) {
147 | // If so, fetch a fresh copy from the network in the background
148 | // (using the async waitUntil polyfill)
149 | event.waitUntil(
150 | // NETWORK
151 | fetch(request)
152 | .then( responseFromFetch => {
153 | // Stash the fresh copy in the cache
154 | caches.open(staticCacheName)
155 | .then( cache => {
156 | cache.put(request, responseFromFetch);
157 | });
158 | })
159 | );
160 | return responseFromCache;
161 | }
162 | // NETWORK
163 | // If the file wasn't in the cache, make a network request
164 | return fetch(request)
165 | .then( responseFromFetch => {
166 | // Stash a fresh copy in the cache in the background
167 | // (using the async waitUntil polyfill)
168 | let responseCopy = responseFromFetch.clone();
169 | event.waitUntil(
170 | caches.open(staticCacheName)
171 | .then( cache => {
172 | cache.put(request, responseCopy);
173 | })
174 | );
175 | return responseFromFetch;
176 | })
177 | .catch( () => {
178 | // OFFLINE
179 | // If the request is for an image, show an offline placeholder
180 | if (request.headers.get('Accept').includes('image')) {
181 | return new Response(
182 | '<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 400 300"><title>Offline</title><rect width="400" height="300" fill="#ddd"/></svg>', // minimal placeholder graphic (any simple SVG will do)
183 | {
184 | headers: {
185 | 'Content-Type': 'image/svg+xml',
186 | 'Cache-Control': 'no-store'
187 | }
188 | }
189 | );
190 | }
191 | // If the request is for a page, show an offline message
192 | if (request.headers.get('Accept').includes('text/html')) {
193 | return caches.match('/offline/');
194 | }
195 | })
196 | })
197 | );
198 | });
199 |
--------------------------------------------------------------------------------
/theindex/index.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 |
9 |
10 |
11 |
12 |
13 |
14 |
15 |
16 | Resilient Web Design—Index
17 |
18 |
19 |
20 |
21 |
22 |
23 |
24 |
25 |
26 |
27 |
28 |
29 |
30 | Resilient Web Design
31 |
32 |
33 |
34 |
35 |
Index
36 |
37 | A
38 | B
39 | C
40 | D
41 | E
42 | F
43 | G
44 | H
45 | I
46 | J
47 | K
48 | L
49 | M
50 | N
51 | O
52 | P
53 | Q
54 | R
55 | S
56 | T
57 | U
58 | V
59 | W
60 | X
61 | Y
62 | Z
63 |
64 |