The Long Tail
Chris Anderson


Today, millions of ordinary people have the tools and the role models to become amateur producers. Some of them will also have talent and vision. Because the means of production have spread so widely and to so many people, the talented and visionary ones, even if they’re just a small fraction of the total, are becoming a force to be reckoned with. Don’t be surprised if some of the most creative and influential work in the next few decades comes from this Pro-Am class of inspired hobbyists, not from the traditional sources in the commercial world. The effect of this shift means that the Long Tail will be populated at a pace never before seen.

THE WIKIPEDIA PHENOMENON

In January 2001, a wealthy options trader named Jimmy Wales set out to build a massive online encyclopedia in an entirely new way—by tapping the collective wisdom of millions of amateur experts, semi-experts, and just regular folks who thought they knew something. This encyclopedia would be freely available to anyone; and it would be created not by paid experts and editors, but by whoever wanted to contribute. Wales started with a few dozen prewritten articles and a software application called a Wiki (named for the Hawaiian word meaning “quick” or “fast”), which allows anybody with Web access to go to a site and edit, delete, or add to what’s there. The ambition: Nothing less than to construct a repository of knowledge to rival the ancient library of Alexandria.

This was, needless to say, controversial.

For one thing, this is not how encyclopedias are supposed to be made. From the beginning, compiling authoritative knowledge has been the job of scholars. It started with a few solo polymaths who dared to try the impossible. In ancient Greece, Aristotle single-handedly set out to record all the knowledge of his time. Four hundred years later, the Roman nobleman Pliny the Elder cranked out a thirty-seven-volume set of the day’s knowledge. The Chinese scholar Tu Yu wrote an encyclopedia on his own in the ninth century. And in the 1700s, Diderot and a few of his pals (including Voltaire and Rousseau) took twenty-nine years to create the Encyclopédie, ou Dictionnaire Raisonné des Sciences, des Arts et des Métiers.

Individual work gradually evolved into larger team efforts, especially after the arrival of the Industrial Revolution. In the late eighteenth century, several members of the Scottish Enlightenment started to apply the industrial principles of scientific management and the lessons of assembly lines to the creation of an encyclopedia such as the world had never before seen. The third edition of the Encyclopædia Britannica, published between 1788 and 1797, amounted to eighteen volumes plus a two-volume supplement, totaling over 16,000 pages. Groups of experts were recruited to write scholarly articles under the direction of a manager, organized by a detailed work chart.

Now Wales has introduced a third model: the open collective. Instead of one really smart guy or a number of handpicked smart guys, Wikipedia draws on tens of thousands of people of all sorts—ranging from real experts to interested bystanders—with a lot of volunteer curators adopting entries and keeping an eye on their progression. In Wales’s encyclopedia calculus, 50,000 self-selected Wikipedians equal one Pliny the Elder.

As writer Daniel H. Pink puts it, “Instead of clearly delineated lines of authority, Wikipedia depends on radical decentralization and self-organization; open source in its purest form. Most encyclopedias start to fossilize the moment they’re printed on a page. However, add Wiki software and some helping hands and you get something self-repairing and almost alive. A different production model creates a product that’s fluid, fast, fixable, and free.”

In 2001, that idea seemed preposterous. But by 2005, this nonprofit venture had become the largest encyclopedia on the planet. Today, Wikipedia offers more than 2 million articles in English—compared with Britannica’s 120,000 (65,000 in the print edition) and Encarta’s 60,000—fashioned by more than 75,000 contributors. Tack on the editions in more than 100 other languages, including Esperanto and Kurdish, and the total Wikipedia article count tops 5.3 million.

All you need to contribute to Wikipedia is Internet access: Every entry has an “Edit This Page” button on it, available to all. Each of us is an expert in something, and the beauty of Wikipedia is that there is practically no subject so narrow that it can’t have an entry. This is in stark contrast to Britannica. If you open that great encyclopedia and find either no entry for what you’re looking for or an entry that seems deficient, there’s little you can do but shake your fist or write a letter to the editor (expecting no response). With Wikipedia, however, you fix it or create it yourself. That shift from passive resentment to active participation makes all the difference. To remix the old joke about the weather, everybody complains about the encyclopedia, but now you can do something about it.

THE PROBABILISTIC AGE

Much is made of the fact that Wikipedia’s entries are “non-authoritative,” which is a way of saying they’re not invariably accurate. This is, of course, inevitable when anyone can write them. Unlike Britannica, where each entry is scrubbed, checked, and labored over by responsible professionals, each Wikipedia entry simply arrives, conjured from the vacuum by the miracle of the “Edit This Page” button.

In late 2005, John Seigenthaler Sr. wrote an op-ed in USA Today about his own Wikipedia entry; the entry started this way:

John Seigenthaler Sr. was the assistant to Attorney General Robert Kennedy in the early 1960’s. For a brief time, he was thought to have been directly involved in the Kennedy assassinations of both John, and his brother, Bobby. Nothing was ever proven.

Aside from the part about him being Robert Kennedy’s assistant in the 1960s, virtually everything else in the entry was false and slanderous. Seigenthaler called Wales and got him to delete the entry (although he could have easily done that himself), but his op-ed about the experience sparked a national debate over whether Wikipedia could be trusted, a question that continues today.

The answer is not a simple yes or no, because it is the nature of user-created content to be as messy and uncertain at the microscale, which is the level at which we usually experience it, as it is amazingly successful at the big-picture macroscale. It just has to be understood for what it is.

Wikipedia, like Google and the collective wisdom of millions of blogs, operates on the alien logic of probabilistic statistics—a matter of likelihood rather than certainty. But our brains aren’t wired to think in terms of statistics and probability. We want to know whether an encyclopedia entry is right or wrong. We want to know that there’s a wise hand (ideally human) guiding Google’s results. We want to trust what we read.

When professionals—editors, academics, journalists—are running the show, we at least know that it’s someone’s job to look out for such things as accuracy. But now we’re depending more and more on systems where nobody’s in charge; the intelligence is simply “emergent,” which is to say that it appears to arise spontaneously from the number-crunching. These probabilistic systems aren’t perfect, but they are statistically optimized to excel over time and large numbers. They’re designed to “scale,” or improve with size. And a little slop at the microscale is the price of such efficiency at the macroscale.
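To make that statistical intuition a little more concrete, here is a minimal sketch (mine, not the book’s) of slop at the microscale buying accuracy at the macroscale: every individual guess in the simulation below is noisy and untrustworthy on its own, but the average of many independent guesses closes in on the true value as the crowd grows. All of the numbers are illustrative assumptions.

```python
import random

# Illustrative sketch only: a "crowd" of noisy, independent guesses
# at some true value. No single guess is reliable, but the average
# sharpens as the crowd grows. All numbers here are assumptions.

random.seed(42)
TRUE_VALUE = 100.0   # the answer nobody knows exactly
NOISE = 40.0         # how far off any single guess can be

def crowd_estimate(n):
    """Average n independent, noisy guesses at TRUE_VALUE."""
    guesses = [random.gauss(TRUE_VALUE, NOISE) for _ in range(n)]
    return sum(guesses) / n

for n in (1, 10, 100, 10_000):
    estimate = crowd_estimate(n)
    print(f"{n:>6} guesses -> estimate {estimate:7.2f}, "
          f"error {abs(estimate - TRUE_VALUE):5.2f}")
```

Any single guess can miss badly; the crowd’s average rarely does, and it tends to get better the larger the crowd becomes.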

But how can that be right when it feels so wrong?

There’s the rub. This tradeoff is just hard for people to wrap their heads around. There’s a reason why we’re still debating Darwin. And why The Wisdom of Crowds, James Surowiecki’s book on Adam Smith’s invisible hand and how the many can be smarter than the few, is still surprising (and still needs to be read) more than two hundred years after the great Scotsman’s death. Both market economics and evolution are probabilistic systems, which are simply counterintuitive to our mammalian brains. The fact that a few smart humans figured this out and used that insight to build the foundations of our modern economy, from the stock market to Google, is just evidence that our mental software (our collective knowledge) has evolved faster than our hardware (our neural wiring).

Probability-based systems are, to use writer Kevin Kelly’s term, “out of control.” His seminal book by that name looks at example after example, from democracy to bird-flocking, where order arises from what appears to be chaos, seemingly reversing entropy’s arrow. The book is more than a dozen years old, and decades from now we’ll still find the insight surprising. But it’s right.

Is Wikipedia “authoritative”? Well, no. But what really is? Britannica is reviewed by a smaller group of reviewers with higher academic degrees on average. There are, to be sure, fewer (if any) total clunkers or fabrications than in Wikipedia. But it’s not infallible either; indeed, a 2005 study by Nature, the scientific journal, reported that in forty-two entries on science topics there were an average of four errors per entry in Wikipedia and three in Britannica. And shortly after the report came out, the Wikipedia entries were corrected, while Britannica had to wait for its next reprinting.

Britannica’s biggest errors are of omission, not commission. It is shallow in some categories and out of date in many others. And then there are the millions of entries that it simply doesn’t—and can’t, given its editorial process—have. But Wikipedia can scale itself to include those and many more. And it is updated constantly.

The advantage of probabilistic systems is that they benefit from the wisdom of the crowd and as a result can scale nicely both in breadth and depth. But because they do this by sacrificing absolute certainty on the microscale, you need to take any single result with a grain of salt. Wikipedia should be the first source of information, not the last. It should be a site for information exploration, not the definitive source of facts.

The same is true for blogs, no single one of which is authoritative. Blogs are a Long Tail, and it is always a mistake to generalize about the quality or nature of content in the Long Tail—it is, by definition, variable and diverse. But collectively blogs are proving more than an equal to mainstream media. You just need to read more than one of them before making up your own mind.

Likewise for Google, which seems both omniscient and inscrutable. It makes connections that you or I might not, because they emerge naturally from math on a scale we can’t comprehend. Google is arguably the first company to be born with the alien intelligence of the Web’s “massive-scale” statistics hardwired into its DNA. That’s why it’s so successful, and so seemingly unstoppable.

Author Paul Graham puts it like this:

The Web naturally has a certain grain, and Google is aligned with it. That’s why their success seems so effortless. They’re sailing with the wind, instead of sitting becalmed praying for a business model, like the print media, or trying to tack upwind by suing their customers, like Microsoft and the record labels. Google doesn’t try to force things to happen their way. They try to figure out what’s going to happen, and arrange to be standing there when it does.

The Web is the ultimate marketplace of ideas, governed by the laws of big numbers. That grain Graham sees is the weave of statistical mechanics, the only logic that such really large systems understand. Perhaps someday we will, too.

THE POWER OF PEER PRODUCTION

As a whole, Wikipedia is arguably the best encyclopedia in the world: bigger, more up-to-date, and in many cases deeper than even Britannica. But at the individual entry level, the quality varies. Along with articles of breathtaking scholarship and erudition, there are plenty of “stubs” (placeholder entries) and even autogenerated spam.

In the popular entries with many eyes watching, Wikipedia shows a remarkable resistance to vandalism and ideological battles. One study by IBM found that the mean repair time for damage in high-profile Wikipedia entries such as “Islam” is less than four minutes. This is not the work of the professional encyclopedia police. It is simply the emergent behavior of a Pro-Am swarm of self-appointed curators. Against all expectations, the system works brilliantly well. And as Wikipedia grows, this rapid self-repairing property will spread to more entries.

The point is not that every Wikipedia entry is probabilistic, but that the entire encyclopedia behaves probabilistically. Your odds of getting a substantive, up-to-date, and accurate entry for any given subject are excellent on Wikipedia, even if every individual entry isn’t excellent.

To put it another way, the quality range in Britannica goes from, say, 5 to 9, with an average of 7. Wikipedia goes from 0 to 10, with an average of, say, 5. But given that Wikipedia has twenty times as many entries as Britannica, your chances of finding a reasonable entry on the topic you’re looking for are actually higher on Wikipedia.
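As a rough sanity check on that arithmetic, here is a small simulation (mine, not the book’s) built on those illustrative numbers: Britannica entries scored uniformly between 5 and 9, Wikipedia entries between 0 and 10, with Wikipedia covering twenty times as many topics. The absolute coverage rate is an assumption chosen only to show the shape of the argument.

```python
import random

# Back-of-the-envelope sketch of the paragraph above. The quality
# ranges (5-9 vs. 0-10) and the 20x coverage ratio come from the text;
# the absolute coverage rate (3% of all possible lookup topics) is an
# assumption for illustration only.

random.seed(1)
TOPICS = 100_000
BRITANNICA_COVERAGE = 0.03                     # assumed
WIKIPEDIA_COVERAGE = 20 * BRITANNICA_COVERAGE  # "twenty times as many entries"
GOOD_ENOUGH = 5                                # minimum acceptable quality score

def hit_rate(coverage, quality_range):
    """Fraction of topic lookups that find an entry of acceptable quality."""
    hits = 0
    for _ in range(TOPICS):
        if random.random() < coverage:                         # topic has an entry
            if random.uniform(*quality_range) >= GOOD_ENOUGH:  # entry is decent
                hits += 1
    return hits / TOPICS

print(f"Britannica: decent entry found {hit_rate(BRITANNICA_COVERAGE, (5, 9)):.1%} of lookups")
print(f"Wikipedia:  decent entry found {hit_rate(WIKIPEDIA_COVERAGE, (0, 10)):.1%} of lookups")
```

Under these assumptions Britannica wins on average quality per entry (7 versus 5), but Wikipedia wins on the question a reader actually asks: will I find a usable entry on my topic at all?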

What makes Wikipedia really extraordinary is that it improves over time, organically healing itself as if its huge and growing army of tenders were an immune system, ever vigilant and quick to respond to anything that threatens the organism. And like a biological system, it evolves, selecting for traits that help it stay one step ahead of the predators and pathogens in its ecosystem.
