100 Ideas That Changed the Web
From the mouse to the GIF, by way of the long tail and technology’s forgotten female
pioneers.
In his now-iconic 1945 essay “As We May Think,”
Vannevar Bush considered the problem of organizing humanity’s
knowledge, which he poetically termed “the common record,” in an
intelligent way amidst an era of information overload. It was a
challenge first addressed a decade earlier by a Belgian idealist named
Paul Otlet, whose global knowledge network called the Mundaneum
sparked the dawn of the modern information age. But it wasn’t until
1999 that Tim Berners-Lee, who had invented the World Wide Web and
launched the first webpage on August 6, 1991, proposed the concept of the
Semantic Web — a seminal stride toward cultivating wisdom in the age of information,
bringing full circle Otlet’s vision of an intelligent global network
for organizing human knowledge. Much like Johannes Gutenberg, who combined a number of existing technologies
to invent his revolutionary press, Berners-Lee was simply bringing
together disjointed technologies — electronic documents, hypertext,
markup, the internet — to create a new paradigm that changed our world
at least as much as Gutenberg’s invention. But how, exactly, did we get
there?
The 98 landmark technologies and ideas that bridged Otlet’s vision with Berners-Lee’s world-changing web are what digital archeologist Jim Boulton chronicles in 100 Ideas that Changed the Web (public library) — the latest installment in a fantastic series of cultural histories by British indie powerhouse Laurence King, including 100 Ideas that Changed Graphic Design, 100 Ideas that Changed Film, 100 Ideas that Changed Architecture, 100 Ideas that Changed Photography, and 100 Ideas that Changed Art.
The hundred ideas range from revolutionary concepts, like the
personal computer (#9), open source (#28), and peer-to-peer networks
(#62), to technologies so rudimentary and ubiquitous that we forget they
were once mere “ideas” in a world without them, like the graphical user
interface (#5), search (#26), email (#51), and the internet itself
(#10), to cultural phenomena like the bulletin board systems (#12) that
geeks used to connect with one another 30 years before Facebook or
online dating (#78), which we still approach with an ambivalent blend of
skepticism, eagerness and, on very rare occasions,
absolute ingenuity. Boulton’s point, however, is to illustrate how even
the most humble among them — like, say, the dear old GIF (#18) — served
as combinatorial building blocks that contributed to the web as we
know, use, and love it.
Tucked into the various chapters are factlets that reveal delightful
and often surprising details about elements of digital communication
we’ve come to take for granted. For instance, in the section on the
emoticon (#19) — which made its debut in 1881 and is also among the 100 diagrams that changed the world
— Boulton explains that telegraph operators used early examples of
type-based sentiment: “73” meant “best regards” and “88” love and
kisses.
He writes in the introduction:
Exploring the history of the Web is not just a nostalgic trip into our recent digital past but an exploration of the very way we think and communicate. Our thought processes are non-linear and erratic but the way we make sense of things and express ourselves is linear. Pioneers like Paul Otlet, Vannevar Bush, Theodor Nelson, Douglas Engelbart and Tim Berners-Lee questioned this conviction. Their legacy is the World Wide Web. A place that breaks down national and cultural borders. A place that blurs the boundaries between generating and exchanging ideas. A place that toppled regimes and created new economic models. A place that has radically changed the way we work, play, shop, socialize and otherwise participate in society. But above all, a place that is for everyone.
The internet, which predates the web by decades, has somewhat
unlikely beginnings. (Boulton makes a lucid, charmingly indignant
distinction between the two: “The terms ‘World Wide Web’ and
‘internet’ are often used interchangeably, which is plain wrong. The
internet is a global system of interconnected computer networks. It is
the infrastructure that carries email, instant messaging, voice over IP,
network games, file transfer and, of course, the Web.”) In the quest to win the Space Race
during the Cold War, the U.S. government established ARPA — the
Advanced Research Projects Agency — with grand ambitions, including the
creation of an Intergalactic Computer Network. On October 29, 1969,
researchers combined ARPA’s three major computing projects — a
communications system that could survive a nuclear attack, a computer
time-sharing concept, and an operating system — to successfully connect
computers between three different universities, creating the world’s
first packet-switching network. Known as ARPANET, it was a manifestation
of the vision for an Intergalactic Computer Network, which is
essentially what we know as the internet.
Even though the first successful packet-switching network was
established in 1969, different such networks around the world operated
by different rules and thus could not communicate with one another. In
the 1970s, Robert Kahn and Vint Cerf set out to establish a common
protocol, which became known as Transmission Control Protocol/Internet
Protocol, or TCP/IP. After a successful test was conducted between
networks in the U.S., U.K. and Norway in 1977, all packet-switching
networks were given a deadline of January 1, 1983, to migrate to the new
protocol. Boulton cites Vint Cerf, father of the internet:
When the day came, the main emotion was relief. There were no grand celebrations — I can’t even find a photograph. Yet, with hindsight, it’s obvious it was a momentous occasion. On that day, the operational internet was born.
One of the book’s most heartening touches is Boulton’s effort to shed
light on the web’s little-known female pioneers, from Hollywood star
Hedy Lamarr, who was once considered the most beautiful woman in the
world, starred in cinema’s first on-screen orgasm, and co-invented the technology that laid the groundwork for Bluetooth and Wi-Fi, to the all-girl science rock band from CERN behind the very first photo uploaded to the web, no less.
Not all ideas are technologies — many are higher-order concepts that
describe cultural phenomena and social dynamics. Among them is the
notion of “the long tail,” a term from statistics that Chris Anderson
popularized as a lens on business and creative culture in his excellent
2006 book of the same title. (I, of course, am partial — Brain Pickings is made possible entirely by the “long tail” of patrons like you.)
Fittingly, in the section on infographics (#68), Boulton traces the evolution of this visual communication genre from Otto Neurath’s invention of pictograms in the 1930s to the impact of data visualization pioneer Edward Tufte to the work of information designers like David McCandless, concisely nailing the peril and promise of this singular form of visual literacy, which requires the mastery of a special language to both create and consume intelligently:
The rise of the social web and our reluctance to read long documents has propelled the work of information designers like Neurath, Tufte and McCandless to the fore. It is boom time for infographics. Alongside other bite-sized shareable content such as photos of kittens and GIF animations, infographics have become a staple part of our media diet… Done badly, you get Chartjunk. Done well, they make data meaningful and entertaining. Sometimes even beautiful.
And, of course, what history of the web could be complete without
everyone’s favorite Graphics Interchange Format, or GIF (#18)? Boulton
offers a brief history that is surprisingly illuminating even for us smug,
GIF-slinging moderns:
It’s 20 years old. It supports only 256 colors. It’s unsuitable for photographs. It has no sound capability. It’s inferior to the PNG. Yet the GIF is still hanging in there. Why has it proved so tenacious? Because it can move.
CompuServe introduced the GIF format in the pre-Web days of 1987. It was released as a free and open specification for sharing color images across the network.
[...] The GIF really took off in 1993 with the release of Mosaic, the first graphical browser. Mosaic introduced the <img> tag, which supported two formats — GIF and a black-and-white format called XBM. Mosaic became Netscape and, as it grew, the GIF grew with it… In 1996, Netscape 2.0 was released. It supported GIF animations — multiple frames shown in succession. The Web went crazy.
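As an aside for the technically curious, the 256-color ceiling Boulton mentions is written right into the GIF header: the global color table can hold at most 256 (2^8) entries. Here is a minimal Python sketch of reading those first few bytes; the function name and the example file name are hypothetical, for illustration only.

```python
import struct

def read_gif_header(path):
    """Inspect the first 13 bytes of a GIF: signature, screen size, and the
    size of the global color table (at most 256 entries, hence the format's
    famous 256-color limit)."""
    with open(path, "rb") as f:
        header = f.read(13)
    signature = header[:6]                      # b"GIF87a" or b"GIF89a"
    if signature not in (b"GIF87a", b"GIF89a"):
        raise ValueError("not a GIF file")
    width, height = struct.unpack("<HH", header[6:10])  # little-endian uint16s
    packed = header[10]
    has_palette = bool(packed & 0b10000000)     # global color table flag
    palette_colors = 2 ** ((packed & 0b00000111) + 1) if has_palette else 0
    return {
        "version": signature[3:].decode("ascii"),
        "width": width,
        "height": height,
        "palette_colors": palette_colors,       # never more than 256
    }

# Example usage (hypothetical file name):
# print(read_gif_header("dancing_baby.gif"))
```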
But perhaps the most poignant section is also the most conceptual —
the notion of “digital fragility” (#41). Boulton captures it elegantly:
Printed in 1455, 48 copies of the Gutenberg Bible exist, yet not one copy of a website made a little over 20 years ago survives.
[...]
Digital content is so easy to duplicate that copies are not valued. Worse, the original version is also often considered disposable. Combine this with the rapid obsolescence of digital storage formats, and it is easy to see why many experts describe the early years of the Web as a digital dark age.
[...]
The last 20 years have seen the birth and rise of the Web at an astronomical pace. We have witnessed the birth of the Information Age, equal in magnitude to the transition to the modern world from the Middle Ages. We have a responsibility to expose this artistic, commercial and social digital history — the building blocks of modern culture — to future generations, an audience who will be unable to imagine a world without the Web.
Until we discover the digital equivalent of acid-free paper, bits and bytes remain extremely fragile.
But the story of the web is an optimistic one — and, more
importantly, one that is still being written. Not coincidentally, the
final idea in the book is the Semantic Web (#100) — the concept that, so
far, offers the greatest promise of helping us transmute information into wisdom, which is increasingly the defining challenge of our age. As Boulton puts it, “Knowledge is information in context.”
The term, perhaps unsurprisingly, comes from Tim Berners-Lee himself:
I have a dream for the Web … in which computers … become capable of analyzing all the data on the Web — the content, links, and transactions between people and computers. A “Semantic Web,” which makes this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines.
The main value of the Semantic Web, however, is that it extracts
meaningful relationships and connections from large sets of information,
which brings us all the way back to Vannevar Bush’s ideal of “establishing useful trails through the enormous mass of the common record,”
and that it helps discern a context for isolated bits of information,
which is the foundation of knowledge and the very thing Paul Otlet
pursued in his vision of the Mundaneum. The web, it seems, is coming
full circle.
100 Ideas that Changed the Web is wonderfully illuminating in its entirety. Complement it with Clive Thompson on how the web is changing the way we think for the better and a close look at just how revolutionary and influential Otlet’s Mundaneum was.
Article by Maria Popova, source: http://www.brainpickings.org/2014/