https://archive.is/0vx9h
Confirmation Bias: Stock Market Traders Who Forecast Changes In Market Won't Adjust Predictions, Even With New Information
Nov 17, 2015 07:11 PM
By Steve Smith
Everyone has that one friend who will argue a fact to death, even if what they believe is wrong. A new study from the University of Iowa has found this friend will argue their point no matter what, even if you prove them wrong and even if it ends up costing them money.
According to study co-author Tom Gruca, professor of marketing at the Tippie College of Business, this form of confirmation bias likely affects equity analysts who are responsible for forecasting changes in the stock market — it prevents new data from affecting their initial predictions. In a press release, Gruca said the new research could therefore help investors understand financial markets by giving them a glimpse into how stock traders think.
For the study, Gruca looked at student traders who participated in the Iowa Electronic Markets, an online futures market at the Tippie College of Business where contract payoffs are based on real-world events. From 1998 to 2008, student traders analyzed market trends for 18 new movies, buying and selling real-money contracts while trying to predict each movie's four-week opening box office total.
Gruca found that even as initial box office receipts showed which contracts were rising or falling, the student traders ignored that information and stuck to their initial estimates. This kept prices relatively stable, because traders influenced by confirmation bias were unwilling to buy or sell against their own forecasts. To find evidence of confirmation bias, the student traders had explained why they forecast the way they did before they began trading. Their explanations then exposed a phenomenon called the explanation effect — once someone has expressed their beliefs, they stand by them no matter what contradictory evidence may appear.
Gruca also had a control group that traded in several markets; these traders weren't asked to write down or explain their forecasts, thus keeping them clear of the explanation effect. Prices in these markets were much more active, because the control traders were more apt to adjust their opinions and incorporate new information while trading.
"This study shows that when all traders in a market have the same bias — in this case, confirmation bias — market prices are not efficient and do not reflect all of the information available," Gruca said. "However, if some traders are not biased, then market prices efficiently reflect new, relevant information."
Source: Cipriano, M., Gruca, T. The Power of Priors: How Confirmation Bias Impacts Market Prices. The Journal of Prediction Markets. 2015.
Another interesting article, describing some additional experiments. An excerpt:
Much more to read: https://archive.is/J8oCM
Why Facts Don’t Change Our Minds
New discoveries about the human mind show the limitations of reason.
By Elizabeth Kolbert
February 19, 2017
In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.
Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.
As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.
In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.
“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”
A few years later, a new set of Stanford students was recruited for a related study. The students were handed packets of information about a pair of firefighters, Frank K. and George H. Frank’s bio noted that, among other things, he had a baby daughter and he liked to scuba dive. George had a small son and played golf. The packets also included the men’s responses on what the researchers called the Risky-Conservative Choice Test. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. In the other version, Frank also chose the safest option, but he was a lousy firefighter who’d been put “on report” by his supervisors several times. Once again, midway through the study, the students were informed that they’d been misled, and that the information they’d received was entirely fictitious. The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who’d received the first packet thought that he would avoid it. The students in the second group thought he’d embrace it.
Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.
I truly wonder how many people are capable of changing their minds when presented with actual evidence. Of course, in none of these studies is the correct position seen as "evil" or "wicked" the way Holocaust revisionism is made out to be.
Reminds me of the following quote:
“You can't convince a believer of anything; for their belief is not based on evidence, it's based on a deep seated need to believe”
- Carl Sagan