August 27, 2005

Web 2.0

Are the internet hypelords getting a bit tired? There's this funny whiff of déjà vu that comes along with the latest and greatest buzzword: Web 2.0. Web 2.0? Wasn't that like 1995? Don't they remember Business 2.0 magazine? Or how all the big companies stopped using version numbers for their software and instead hired professional marketers to invent even blander and more confusing names? I hear "Web 2.0" and immediately smell yet another hit off the dotcom crackpipe...

But perhaps that's a little too harsh. While Web 2.0 might have emerged in large part from tech publisher O'Reilly's PR, underneath it is a real feeling among some that something is going on that makes the web of today different from the web of a few years ago. Blogs, open standards, long tails and the like. The most concise and clear definition I've found is Richard MacManus': "the philosophy of Web 2.0 is to let go of control, share ideas and code, build on what others have built, free your data." Which of course doesn't sound that different from, say, the goals of the plain old unnumbered "web" of ten years ago. But the Web 2.0 boosters are right, the web is different now; the big differences just aren't necessarily found in those prosaic "information wants to be free" ideals, which actually stand as one of the biggest constants in web evolution.

What really separates the "Web 2.0" from the "web" is the professionalism, the striation between the insiders and the users. When the web first started, any motivated individual with an internet connection could join in the building. HTML took an hour or two to learn, and anyone could build. In the Web 2.0 they don't talk about anyone building sites, they talk about anyone publishing content. What's left unsaid is that when doing so they'll probably be using someone else's software. Blogger, TypePad, or if they are a bit more technical maybe WordPress or Movable Type. It might be getting easier to publish, but it's getting harder and harder to build the publishing tools. What's emerging is a power relationship: the insiders who build the technology and the outsiders who just use it.

The professionalization of the web has been a long and gradated process. The line between amateur and pro didn't exist at the dawn of the web, but over the course of the years, over the course of new technologies, a gap appeared and it continues to widen. There have been web professionals for a decade now, but whereas the distinction between a pro and an amateur was once a rather smooth one, it is now a highly striated one. Early html took an afternoon to learn. Simple javascript, early versions of Flash, basic database usage, php: these are things that took a motivated but unexceptional individual a weekend to learn. All it took to transform into a pro was a weekend, a bit of drive and the ability to sell yourself to an employer. This was a smooth separation.

It's 2005 now and Ajax is the latest and greatest in web tech. If you want to build an Ajax site, you have two real options: be a professional or hire a professional. I'm sure there are a few people out there who could teach themselves Ajax in a weekend, but they would have to be exceptional individuals. You can't just view source and reverse engineer Gmail or Reblog. You need to be a professional programmer who understands web standards, databases, CSS and dynamic html... These are apps built not just by pros, but often by teams of pros. The difference between professional and amateur is no longer smooth, but striated.

The Web 2.0 is a professional web, a web run by insiders. In the larger space of the software industry as a whole these are still young, brash upstarts pushing a somewhat radical agenda of openness and sharing. In contrast to the agendas of old line software companies like Microsoft and Sun, AOL and Oracle, the Web 2.0 actually merits some of its hype. The world of RSS feeds, abundant APIs and open source code really is a major departure from the "own and control" approaches of an earlier generation of companies, and something I'm personally in favor of. But just how open are these technologies really? And just how many people do they empower? Take a close look and Web 2.0 looks a bit more like a power grab and a bit less like a popular revolution.

Like the proponents of "free" markets, the pushers of Web 2.0 seem to have quite an idealistic idea of just what "free" and "open" are, and how systems based around those concepts actually function. Peter Merholz is perhaps the sharpest and most thoughtful of Web 2.0 evangelists, and his essay "How I Learned To Stop Worrying and Relinquish Control" just might be the best argument for the Web 2.0 philosophy around. But it also paints a radically misleading picture of what it means to "relinquish control". For relinquishing control doesn't just mean letting go, losing control; it actually means controlling just how you let go.

Netflix is a great example. Merholz talks about how the company's success revolved around giving up on late fees; unlike traditional video stores they did not control how long a customer could keep a video. A smart move for sure, but they didn't just relinquish control, they opted to control several other key factors instead. They gave up control over the length of the rental, but in exchange took control over how many videos a customer could have at any given time, and over the final decision as to which video a customer would get. Netflix isn't giving up control, they are exchanging it; they built a highly controlled system which enabled them to allow certain vectors, namely the length of video rentals, to fluctuate freely.

Then there are Amazon's customer reviews, which Merholz prominently cites as an example of a company relinquishing control to its customers. And indeed, if you write a review there is a good chance your words will show up on Amazon's page for the book. Amazon will cede control of that small section of the page to you. But just how much do they really give up? In submitting a review the reviewer grants Amazon "and its affiliates a nonexclusive, royalty-free, perpetual, irrevocable, and fully sub-licensable right to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, and display such content throughout the world in any media." Even then Amazon requires you to follow their review guidelines and delays publication for 5 to 7 business days, quite possibly so that they can review the review in some way. Once this is all done the review is placed on a page whose layout Amazon completely controls. The reviews go near the bottom, well "below the fold". So just how much control has Amazon given away? And just how much have they gained back in return?

At the technological core of the Web 2.0 ideology is another innovation that Amazon has been an early leader in: public APIs, or Application Programming Interfaces. APIs are tricky concepts to grasp; they are essentially ways in which one computer program can talk to another, or one part of a computer program can talk to another part. Until recently, until Web 2.0, talking about public APIs basically meant talking about computer operating systems. Most APIs were private, things that development teams used to build complex systems of interlocking programs, systems like Amazon, Ebay and Google. Amazon and Ebay in particular have quite complex relationships with a certain subset of their customers who happen to run businesses that rely in part or entirely on Amazon or Ebay services. Amazon has its affiliates and zShops, while Ebay has its power sellers and eBay stores. I haven't been able to track down a good history of public web APIs, but I suspect Amazon and Ebay released theirs mainly as a service to their power customers, as a way to help these customers make them even more money. Google on the other hand released its public API mainly as a geek toy, not as a revenue source. The sort of action that makes Web 2.0 devotees ecstatic. The public API is a way to share data, to allow independent programmers to build their own applications using information collected and sorted by the likes of Google and Amazon, and to allow users to access this data in a variety of ways not fully controlled by the data holder. The public face of the public API is one of openness and sharing, of relinquishing control. Look a bit behind that facade though, and once again we find yet another system of control.

A public API is not what a company's internal developers use to extend their own systems. It doesn't give you full access to the data or full access to the functionality of the system. This is often a good thing; as an Amazon customer I'm quite happy that the Amazon public API does not include access to credit card data or purchasing habits. Despite all the Web 2.0 hype about open data, I've never seen anyone argue for companies sharing this info. But the limits on what can be accessed via a public API go far beyond protecting confidential user information. In fact the company creating the API has absolute control over what goes into it. They may be giving up a degree of control, but they are controlling exactly what that degree is.
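The boundary described above is easy to picture in code. Below is a minimal sketch, with entirely hypothetical names and data (not any real Amazon interface): the public object exposes only the methods the operator chooses, so payment and behavioral data simply have no path out.

```python
class InternalStore:
    """Everything the company's own developers can touch."""
    def __init__(self):
        self.catalog = {"0596007973": {"title": "Web 2.0 Report", "price": 19.95}}
        self.credit_cards = {"alice": "4111-1111-1111-1111"}   # never exposed
        self.purchase_history = {"alice": ["0596007973"]}      # never exposed

class PublicAPI:
    """The outward-facing interface: the company decides exactly what goes in."""
    def __init__(self, store: InternalStore):
        self._store = store

    def item_lookup(self, isbn: str) -> dict:
        # Only catalog data crosses the boundary; payment and behavioral
        # data have no corresponding method here at all.
        item = self._store.catalog.get(isbn)
        if item is None:
            raise KeyError("unknown item")
        return dict(item)   # a copy, so callers can't mutate internal state

api = PublicAPI(InternalStore())
print(api.item_lookup("0596007973")["title"])   # prints: Web 2.0 Report
# There is no api.credit_cards: the public surface simply omits it.
```

The asymmetry is structural, not a matter of trust: nothing the API user does can widen the interface, only the operator can.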

A company that allows you to access their databases and applications via an API is clearly more open than one with no API at all. But the API is also instrumental in establishing an asymmetrical power relationship between the API maker and the API user. The user is free to use the API, but the creator controls just what goes into it. In addition, use of the API is almost always governed by a license restricting just how free the user can be with it. Google's API for instance restricts the number of "automated queries" to 1000 a day. This essentially means it can be used to prototype an application, but not to build any commercial use beyond the smallest of scales. And just in case, the license also clearly prohibits any commercial use at all. Is this a way to free the data, or a way to implement another level of control over it?
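The quota mechanism can be sketched in a few lines. This is an illustrative stand-in, not Google's actual enforcement (which in reality happened on the server side, keyed to a registered developer); the point is that the ceiling is a designed-in control, not an accident of capacity:

```python
from datetime import date

class QuotaExceeded(Exception):
    pass

class MeteredClient:
    """Toy client enforcing a 1000-queries-per-day ceiling."""
    DAILY_LIMIT = 1000

    def __init__(self):
        self._day = date.today()
        self._used = 0

    def search(self, query: str) -> str:
        if date.today() != self._day:              # a new day resets the counter
            self._day, self._used = date.today(), 0
        if self._used >= self.DAILY_LIMIT:
            raise QuotaExceeded("daily query limit reached")
        self._used += 1
        return f"results for {query!r}"            # stand-in for a real API call

client = MeteredClient()
for _ in range(1000):
    client.search("web 2.0")                       # the full day's allowance
try:
    client.search("web 2.0")                       # call 1001 is refused
except QuotaExceeded as e:
    print(e)
```

A thousand calls is plenty for tinkering and nowhere near enough for a business, which is exactly the shape of freedom the license describes.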

Any user of a public API runs the risk of entering a catch-22. The more useful the API is, the more dependent the user becomes on the API's creator. In the case of Ebay sellers or Amazon affiliates this is often a mutually beneficial relationship, but it is also inherently unbalanced. The API user holds a position somewhat akin to a minor league baseball team or a McDonald's franchisee: they are given the tools to run a successful operation, but are always beholden to the decisions of the parent organization. You can make a lot of money in one of those businesses, but you can't change the formula of the "beef" and you always run the risk of having your best prospects snatched away from you.

There is another asymmetrical relationship at work in the public API system, an asymmetry of data. The public API rarely, if ever, gives full access to the data the way an internal API can. Even the most open of public APIs will not give access to stored credit card numbers and passwords, at least not intentionally. Often though the gap between the two systems is far greater. Google's public API for instance allows you to do searches and dictionary lookups, but doesn't give access to any of the data mining functions at work in Google's internal system. You can't use the API to find out which terms are searched for most, what sort of searches are originating from a particular address, or what one particular user (based on Google's infamous 30 year cookie) has searched for over the past year. That sort of data mining is reserved for Google employees and their associates. And not only is the API user denied access to much of this information, they are also gifting Google with even more data to mine. With every public API call the creator gives out information it already possesses, while gaining a new piece of information back: information on what people are interested in.
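That exchange can be sketched as a toy search service (all names hypothetical): each call answers the user's query and, in the same stroke, adds to a log that only the operator can read.

```python
from collections import Counter

class LoggedSearchAPI:
    """Toy service illustrating the data asymmetry of a public API."""
    def __init__(self):
        self._query_log = Counter()    # visible only to the operator

    def search(self, query: str, client_ip: str) -> list:
        # The operator's side of the exchange: one more data point
        # about what people are interested in.
        self._query_log[query] += 1
        # The user's side: the answer they asked for.
        return [f"result for {query}"]

    def top_queries(self, n: int = 5):
        # Internal analytics; no public method would expose this.
        return self._query_log.most_common(n)

api = LoggedSearchAPI()
for q in ["ajax", "ajax", "rss"]:
    api.search(q, client_ip="203.0.113.7")
print(api.top_queries())   # the operator sees demand the users never do
```

Every query is simultaneously a service rendered and a datum collected; the user only ever sees the first half of the transaction.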

At the core of the API is a system of control: the API creator has a nearly limitless ability to regulate what can go in and out of their system. And it is precisely this system of control that allows the API to set certain vectors of information free. In Google's case that means the ability to obtain ranked search results, definitions and a few other things. In Amazon's case it's book data: images of the cover, author names, titles, prices, etc. Ebay's lets you build your own interface to sell via their marketplace. Flickr's lets you search photos. In no case does the public API give full access to the system. You can't find passwords, credit card info, or users' addresses, all of which is a good thing. Nor can you find much info on what other API users are doing, or what the people using the standard web interface to these systems are doing. Often the volume of your activity is restricted. Often access requires registration, meaning not only is use of the API monitored, but it's also possible to associate that activity with a particular individual. By design, and perhaps by necessity, an API privileges the creator over the user.

Privilege is what the Web 2.0 is really about. What separates the Web 2.0 from that plain old "web" is the establishment and entrenchment of a hierarchy of power and control. This is not the same control that Microsoft, AOL and other closed-system, walled-garden companies tried unsuccessfully to push upon internet users. Power in the Web 2.0 comes not from controlling the whole system, but from controlling the connections in a larger network of systems. It is the power of those who create not open systems, but semi-open systems; the power of API writers, network builders and standards definers.

Nothing embodies the paradoxes of Web 2.0 "freedom" more than the open standard. Open standards are freely published protocols that people voluntarily agree to comply with. Standards like html (for publishing web pages), css (for controlling the look and layout of webpages), rss (for subscribing to information feeds) and jpeg (for compressing and viewing photolike images). These standards are not nearly as open as their name might imply. Sometimes they are created and run by corporations (Adobe's pdf format), sometimes by nonprofits (the W3C, which governs html standards), and sometimes, as with RSS, there are public fights and competing versions. Implementing changes to an open standard at the very least requires considerable political skill; one can easily make their own version of a standard, but unless they can convince others to adopt that version, it's not a standard at all. It is only by gaining users that a protocol gains potency, and to do so the standard itself must be politicized, and frequently institutionalized.*

The real hook to the freedoms promised by the Web 2.0 disciples is that they require a nearly religious application of open standards (when, of course, they don't involve using a "public" API). The open standard is the control that enables the relinquishing of control. Data is not meant to circulate freely; it's meant to circulate freely via the methods prescribed by an open standard. In order to relinquish control over data one first must establish firm control over how that data is formatted and presented. An action that increasingly requires the services of a professional, whose involvement of course adds another layer of control. This is the world of the Web 2.0: a world of extreme freedom along certain vectors, extreme freedom for certain types of information. It is also a world of hierarchies and regulations, a world in which a (new) power structure has begun to establish and stratify itself.
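To make the "control that enables letting go" concrete, here is a minimal RSS 2.0 channel built with Python's standard library (the feed data is illustrative). The data circulates freely only because its shape is fixed in advance by the specification:

```python
import xml.etree.ElementTree as ET

def build_rss(title: str, link: str, items: list) -> str:
    """Serialize a list of entries into a minimal RSS 2.0 document."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    # RSS 2.0 requires title, link and description on the channel.
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    ET.SubElement(channel, "description").text = title
    for entry in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = entry["title"]
        ET.SubElement(item, "link").text = entry["link"]
    return ET.tostring(rss, encoding="unicode")

feed = build_rss(
    "Abstract Dynamics",
    "http://example.org/",
    [{"title": "Web 2.0", "link": "http://example.org/web20"}],
)
print(feed)
```

Any aggregator can consume this feed precisely because the author has surrendered all discretion over its structure; the freedom of the subscriber is purchased with the discipline of the publisher.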

If we return to Peter Merholz's essay, this can be seen rather clearly. Its title indicates it's about him giving up control, but of course it's really an argument that others should give up control. But where should this control go? How should it be done? This is, in Merholz's words, "a scary prospect". In the end he's not just arguing that companies should relinquish control; he's arguing that they should relinquish control over to him, his company Adaptive Path, and others that share their philosophy. Relinquish control over to the professionals, those that know what they are doing, those that know how to control things on the internet.

None of this should in any way be construed as a critique of the Web 2.0 itself; rather it is a critique of those who push one-sided visions of what the Web 2.0 is. If pushed into an oversimplified judgment I would come out solidly in favor of public APIs, open standards and the circulation of information along the passages these systems create. But these transformations do not come unmitigated; they do not come without hooks and catches. In many ways Web 2.0 is just another revolution. Like many revolutionaries the leaders of the Web 2.0 make broad promises of empowerment for their supporters. But history shows time and time again that once the dust clears and the dirty battles are washed away, it is the leaders, the insiders, that are by far the most empowered. At its heart this is the Web 2.0: a power grab by the internet generation, the installation of a new power structure, a new hierarchy, a new system of control.

*for a much more detailed exposition on the standards process and the issues of protocol see Alex Galloway's .

Posted by Abe at August 27, 2005 04:59 PM


The benefits gained from a higher degree of web 2.0 professionalism are enormous, and they don't invalidate the easy-on promises of 1998. Longer-winded response at

Amazon were asked to open up their API to the public by Tim O'Reilly, as they were screenscraping to get statistics on the tech book market.

Stop using the term "Web 2.0". It's wrong on so many levels. Just stop it.

Michal - The freedom/discipline take is really interesting, not how I think at all and thus enlightening.

James - Thanks, that's interesting; a history of the API would make a great project for someone.

Karl - I agree, except that I realize that I can't stop the term, therefore it's better to try and have a say in what it means rather than be annoyed by it...

The vision you sketch out is scary, and altogether too likely. Luckily, the big API-controlling insiders can't stop one from learning basic html and css and publishing great content in valid formats.
As you come close to pointing out, the term 'revolution' is almost meaningless in technological circles. True revolutions are political and physical. It's sad that people seem so blasé about ceding control to a company like Google ("Don't be evil"). We shouldn't cut Google any more slack than we would Exxon or GM.

This is a great article for stimulating thoughtful comment and debate about the past and continued evolution of web technologies. I have to disagree, however, on some of the basic premises.
Firstly, there is nothing 'revolutionary' taking place. Content demand and the imagination of countless developers have led to an ever increasing layered complexity in the types and methodology of data passed over the net. AJAX is principally an integration of languages and procedures that have developed to serve the needs of designers and users. The basic structures, HTML, php, DHTML etc. remain accessible to all but the slowest hamsters.
Open standards and APIs are tools. Tools that allow users a degree of control over how available data is manipulated and displayed.
Let the natural selection begin; again.


