Number of Coins in a Jar Discrete or Continuous

[Figure: histogram of the coin jar guesses]

A few weeks ago, I asked the internet to guess how many coins were in a huge jar (below). For more than 27 years, my parents had saved their spare change. My mother recently trucked the whole load to a bank to cash in, and in so doing finally learned the stockpile's actual value, or at least the value as calculated by that particular coin-counting machine. The update from Mom got me wondering: Might someone be able to guess that amount? What about our collective estimate---is the crowd really as wise as some say it is?

The mathematical theory behind this kind of estimation game is apparently sound. That is, the mean of all the estimates will be uncannily close to the actual value, every time. James Surowiecki's best-selling book, The Wisdom of Crowds, banks on this principle, and details several striking anecdotes of crowd accuracy. The most famous is a 1906 competition in Plymouth, England, to guess the weight of an ox. As reported by Sir Francis Galton in a letter to Nature, no one guessed the actual weight of the ox, but the average of all 787 submitted guesses was almost exactly the beast's weight.

Galton, who also happens to be the inventor of eugenics, was shocked to find such value in "democratic judgment."

The notion that the hive is more intelligent than the individuals comprising it is a seductive one, and a keystone of today's bottom-up Big Data revolution. It's democratic ideology, open-source goodness, the "invisible hand," and New Age humility all wrapped into a big networked hug.

[Photo: The Steiners' coin jar. Credit: Susie Steiner]

But is it true? The results of the coin jar experiment offer some clues. I won't bore you with the finer points of asymmetric non-parametric one-sample T tests, but let's put it this way: The crowd was waaaay off.

The actual value of the coins was $379.54. The mean value (x̅) of the 602 guesses submitted was $596.12, about 57 percent too high. What's more, a massive (by crowd-wisdom standards) 40 percent of the individual guesses were closer to the actual value than that of the crowd.
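
To see how those headline numbers fall out of a list of guesses, here is a minimal sketch in Python. The `guesses` list is a made-up stand-in, since the raw 602 entries aren't reproduced here; only the $379.54 figure comes from the experiment.

```python
from statistics import mean

actual = 379.54
guesses = [150.00, 287.50, 402.13, 596.00, 880.00, 1500.00]  # hypothetical stand-ins

crowd_mean = mean(guesses)
pct_off = (crowd_mean - actual) / actual * 100

# An individual "beats the crowd" when their own guess is closer to the
# true value than the crowd's mean is.
crowd_gap = abs(crowd_mean - actual)
beat_crowd = sum(abs(g - actual) < crowd_gap for g in guesses) / len(guesses)

print(f"crowd mean: ${crowd_mean:.2f} ({pct_off:+.0f}% off)")
print(f"share of guesses beating the crowd: {beat_crowd:.0%}")
```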

After looking at the histogram above, keen observers will notice that the data are highly skewed. In these cases, how do you summarize what the crowd thinks?

The mean, or average, is still considered the gold standard for aggregating crowd wisdom, although Francis Galton himself recommended using the "vox populi," his term for the median: the middlemost guess, where half of the other guesses are higher and half are lower. In our case, a weighted mean might be the most appropriate value, giving more influence to guessers who reported having "actually done the math."

The problem is, people who claimed to have done some math were far less accurate (x̅ = $724.81) than those who made a snap judgment (x̅ = $525.02). This may explain why estimates submitted from .edu or gmail addresses were less accurate than guesses submitted from hotmail and yahoo addresses. (Here are the data.)
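
For a sense of how those three summaries can pull apart on right-skewed data, here is a small sketch. The guesses, the "did the math" flags, and the doubled weight are all invented for illustration; nothing here reproduces the real data set.

```python
from statistics import mean, median

guesses  = [220.00, 310.00, 365.00, 410.00, 780.00, 1450.00]  # hypothetical
did_math = [False, False, True, False, True, True]            # hypothetical self-reports

simple_mean = mean(guesses)    # dragged upward by the long right tail
middlemost  = median(guesses)  # Galton's "vox populi"

# Weighted mean: give self-reported "did the math" guessers double weight.
weights = [2.0 if m else 1.0 for m in did_math]
weighted_mean = sum(w * g for w, g in zip(weights, guesses)) / sum(weights)

print(f"mean: {simple_mean:.2f}   median: {middlemost:.2f}   weighted: {weighted_mean:.2f}")
```

On data skewed like these, the median sits well below the mean, and whether weighting helps depends entirely on whether the "skilled" guessers really were more accurate, which in this experiment they were not.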


So what happened to the collective intelligence supposedly buried in our disparate ignorance?

Most successful crowdsourcing projects are essentially the sum of many small parts: efficiently harvested resources (information, effort, money) courtesy of a large group of contributors. Think Wikipedia, Google search results, Amazon's Mechanical Turk, and Kickstarter.

But a sum of parts does not wisdom make. When we try to produce collective intelligence, things get messy. Whether we are predicting the outcome of an election, betting on sporting contests, or estimating the value of coins in a jar, the crowd's take is vulnerable to at least three major factors: skill, diversity, and independence.

A certain amount of skill or knowledge in the crowd is obviously required, while crowd diversity expands the number of possible solutions or strategies. Participant independence is important because it preserves the value of individual contributors, which is another way of saying that if everyone copies their neighbor's guess, the data are doomed.

Failure to meet any one of these conditions can lead to wildly inaccurate answers, information echo, or herd-like behavior. (There is more than a little irony with the herding hazard: The internet makes it possible to measure crowd wisdom and maybe put it to use. Yet because people tend to base their opinions on the opinions of others, the internet ends up amplifying the social conformity effect, thereby preventing an accurate picture of what the crowd actually thinks.)
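
A toy simulation makes the independence point concrete. Everything below is an assumption made for illustration (the noise level, the 80/20 anchoring split, even the idea that guessers see a running average); the point is only that when people copy what came before, individual errors stop cancelling out.

```python
import random

random.seed(0)
TRUE_VALUE, N, NOISE_SD, TRIALS = 379.54, 602, 150.0, 200

def crowd_mean_error(herding):
    """Return how far the crowd's mean lands from the true value in one simulated jar."""
    total = 0.0
    for i in range(N):
        own = random.gauss(TRUE_VALUE, NOISE_SD)   # private, roughly unbiased estimate
        if herding and i > 0:
            running_mean = total / i
            own = 0.8 * running_mean + 0.2 * own   # mostly copy the public consensus
        total += own
    return abs(total / N - TRUE_VALUE)

for herding in (False, True):
    avg_err = sum(crowd_mean_error(herding) for _ in range(TRIALS)) / TRIALS
    label = "herded" if herding else "independent"
    print(f"{label:>11} crowd: typical error of the mean ~ ${avg_err:.2f}")
```

With independent guessers the errors average away; with herding, the whole crowd drifts with whoever happened to guess first, and adding more guessers does little to fix it.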

What's more, even when these conditions—skill, diversity, independence—are reasonably satisfied, as they were in the coin jar experiment, humans exhibit a whole host of other cognitive biases and irrational thinking that can impede crowd wisdom. True, some bias can be positive; all that Gladwellian snap-judgment stuff. But most biases aren't so helpful, and can too easily lead us to ignore evidence, overestimate probabilities, and see patterns where there are none. These biases are not vanquished simply by expanding sample size. On the contrary, they get magnified.
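
To see why sheer numbers don't wash out a shared bias, consider a toy model in which every guess carries the same systematic offset on top of individual noise. The offset and noise level below are made-up values chosen only to echo the shape of the coin jar result, not estimates of anything real.

```python
import random
from statistics import mean

random.seed(7)
TRUE_VALUE = 379.54
SHARED_BIAS = 215.0   # assumed common overestimate, e.g. from judging a photo of the jar
NOISE_SD = 300.0      # assumed individual scatter

for n in (10, 100, 1000, 10000):
    guesses = [TRUE_VALUE + SHARED_BIAS + random.gauss(0, NOISE_SD) for _ in range(n)]
    print(f"n = {n:>5}   crowd mean = {mean(guesses):7.2f}   (true value = {TRUE_VALUE})")
```

The individual scatter shrinks roughly as one over the square root of n, but the shared offset never does; a bigger crowd just becomes more confidently wrong.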

Given the last 60 years of research in cognitive psychology, I submit that Galton's results with the ox weight data were outrageously lucky, and that the same is true of other instances of seemingly perfect "bean jar"-styled experiments. There will always be predictable systematic bias in these types of tasks, whether it is due to the type of item being guessed, the size or shape of the container, the public or private nature of the guesses, whether the guessing is part of a contest, or whether the guess is made from a photograph or in person.

All of this is not to suggest that we aren't onto something potentially massive and powerful in the new era of analytics, citizen science, and non-hierarchical organizational design. But the future of our collective wisdom depends on a better understanding of the biases that affect individual cognition and of the dynamics of the crowd, so that we may better capitalize on both.

In the meantime, against all logic, my parents have started slowly filling up a new coin jar. In another 27 years, will our collective guess be any closer?

David Wolman, author of the new ebook Firsthand, contributed to this report.


Source: https://www.wired.com/2015/01/coin-jar-crowd-wisdom-experiment-results/
