The World Wide Web keeps expanding, spawning version after version. And to think it all began with an ego bruise…
Unless you are living under a rock with no Wi-Fi signal, you would have heard of Web 2.0. In fact, one of the enduring clichés of our era is the suffixing of 2.0 to pretty much anything — governments that have websites do Governance 2.0; milkmen who take orders through mobile phones do Milk 2.0; or, m-Milk, where m stands for mobile. Back in the late 1990s, everyone was doing e-something — E-Commerce, E-Biz...
But jargon aside, 2.0 is quite a misleading suffix because there have been more than two versions of the Internet. To understand this better, we must go back to 1957, to a spherical Russian object that went by the friendly name Sputnik.
The very first man-made object to be launched into space caused a bit of a stir across the Iron Curtain. How could a bunch of underfunded Soviet scientists with thumb rules and callipers get ahead in the Space Race, the U.S. wondered. Out of that massive ego bruise was born the Advanced Research Projects Agency (ARPA). It was set up primarily to fund the craziest, most impractical ideas that no sane government would consider funding. The U.S. felt that bold leaps in technology came from crazy people in secret research labs.
Well, they weren't wrong. One of the first challenges they threw out was the problem of connecting computers. Computers, back in the late 1950s, were mostly several Godrej almirah-sized monstrosities with heaps of wires streaming out like spaghetti. Despite all these wires, they were mostly lonely savants; they couldn't speak to other computers. So, the very first version of the Internet was a bunch of U.S. Defence computers exchanging pleasantries and the occasional top-secret-missile-launch coordinates.
In fact, in networking jargon, when two devices acknowledge each other, it's called a ‘handshake’. This network eventually grew to include several top U.S. universities, and ARPANet, as it was called in those days, was essentially Web 1.0. It was all about connecting computers. In 1989, a British engineer named Tim Berners-Lee felt we had to move beyond just connecting devices. He came up with a neat idea that raised the level of abstraction and allowed us to connect pages of content. Every time you click on a hyperlink in your browser today, the cosmos awards Berners-Lee some karma. This eventually resulted in what we now know as the World Wide Web, a spectacular mess of links that only Google is able to make sense of. This, if one were historically accurate, is Web 2.0!
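For the curious, here is a minimal sketch of that acknowledgement ritual in Python, using two programs greeting each other over a local TCP socket. (TCP itself performs its own three-way handshake — SYN, SYN-ACK, ACK — underneath every such connection; the port number here is an arbitrary choice for illustration.)

```python
# A toy 'handshake': a client says hello, a server acknowledges.
import socket
import threading

ready = threading.Event()  # so the client waits until the server is listening

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 50007))  # arbitrary local port for this sketch
        s.listen(1)
        ready.set()
        conn, _ = s.accept()
        with conn:
            greeting = conn.recv(1024)          # receive "HELLO" from the client
            conn.sendall(b"ACK " + greeting)    # acknowledge it

t = threading.Thread(target=server)
t.start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
    c.connect(("127.0.0.1", 50007))
    c.sendall(b"HELLO")
    reply = c.recv(1024)
    print(reply.decode())  # ACK HELLO

t.join()
```

Once both sides have exchanged and acknowledged a greeting, they can get on with the real conversation — pleasantries first, missile coordinates later.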
By the early 2000s, another shift happened. People were suddenly getting bored of links. We started asserting our social human selves and went on to waste untold amounts of time doing ‘social notworking’.
This is the Web of People, a smorgasbord of flattering photos, lunchtime choices, opinionated tweets, and utterly pointless memes. This is the era we now live in, the Web 3.0 era that is mistakenly called Web 2.0. In case you aren't confused yet, this is what the Web 2.0 folks (who are actually Web 3.0 folks) refer to as Web 3.0, which is, in reality, Web 4.0.
What's coming next is connected devices and location sensitivity. It's already there in bits and pieces. I know where my friends are, thanks to Foursquare and Google Latitude. When I walk into Mylai Karpagambal mess and my smartphone informs me my friend was here a while ago, and he thought the spinach vada was the greatest thing ever to be fried in a pan, it's Web 4.0.
There's only one thing that has survived every version of the Web. Spam. But that's my next column.
(A new weekly column about global technology and local knowledge)