You know the old joke: a man goes to the doctor and is told he only has a month to live.
“Surely not!” he gasps. “I want a second opinion!”
“Alright then,” says the doctor. “You’re hideously ugly, too.”
The misunderstanding arises because the doctor is arrogant enough to think her patient trusts her as an expert on multiple issues, when the patient was, in fact, worried about error. The doctor might have seen a positive test result for a killer disease and taken it at face value, without considering that the disease is vanishingly rare, so the positive result was most likely a false one. In other words, the patient might have been concerned that the doctor’s judgement, because of her failure to consider the “base rate” of the disease, was subject to bias: an error skewed in a specific direction.
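To see why the rarity of the disease matters so much, here’s a rough back-of-the-envelope version of the doctor’s mistake, sketched in a few lines of Python. The numbers are purely illustrative, chosen for the sake of the example rather than taken from the joke or from Kahneman’s work: a disease affecting 1 in 10,000 people, and a test that catches 99% of true cases but also wrongly flags 1% of healthy people.

```python
# Illustrative base-rate calculation (Bayes' rule) with made-up numbers.
prevalence = 1 / 10_000        # 1 in 10,000 people actually have the disease
sensitivity = 0.99             # the test detects 99% of true cases
false_positive_rate = 0.01     # the test wrongly flags 1% of healthy people

# Probability of testing positive at all (true positives plus false positives)
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# Probability of actually having the disease, given a positive test
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"{p_disease_given_positive:.1%}")  # roughly 1%: most positives are false alarms
```

Run the numbers and the answer comes out at roughly 1%: even with a test that sounds highly accurate, almost every positive result for a rare disease is a false alarm. That neglected “base rate” is precisely what the patient suspects the doctor has ignored.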
Alternatively, maybe the patient was concerned that the doctor had carelessly misread the test results, or even read those of a different patient. Another doctor, even one of similar skill, would be unlikely to make the exact same mistake, hence the request for a second opinion. So rather than bias, the patient might have been worried about noise: the tendency for human judgments to vary in unwanted, unpredictable and arbitrary ways.
The first type of error, bias, is well-known, thanks to the work of Daniel Kahneman, who is among the most famous psychologists in the world. As he chronicled in his mega-blockbuster popular-science book Thinking, Fast and Slow, Kahneman spent decades with his colleague Amos Tversky cataloguing all the ways human thinking can go off the rails: not just the “base rate neglect” that we saw above, but all sorts of other biases. These include “anchoring” — best explained by the sales move where a shop gives an item a super-high price and then gives you 50% off, even though half that price is still more than you’d have paid had you never seen the initial figure. There’s also “framing”, where asking a question in different ways can affect people’s answers (would you choose to have surgery that has a “10% death rate”? What if I told you it had a “90% survival rate”?). For these and many other contributions, Kahneman remains the only psychologist ever to have won a Nobel Prize, in 2002.
There was, however, a certain irony in Thinking, Fast and Slow. Whereas the biases and heuristics that Kahneman identified have been borne out extremely well by subsequent studies, a good chunk of the rest of the book, where Kahneman talked about other scientists’ work, hasn’t. For instance, Kahneman devotes a chapter to a certain kind of social psychology study where barely noticeable “priming” stimuli are shown to participants in lab studies, with the intention of changing their behaviour. For example, one set of researchers claimed that showing people a screensaver with banknotes on it made them less likely to want to help a struggling student — because it “primed” the idea of money, and thus selfishness, in their minds.
Long story short: those studies were weak, and other scientists can’t find similar results when they try to re-run the experiments. There’s plenty of evidence for priming in language — people react faster when asked to decide which of “CHAIR” and “CHIAR” is a real word if they’ve just seen the word “TABLE”, compared to if they’ve just seen a word unrelated to furniture. But the type of priming study where a barely noticeable prime makes major, measurable changes to people’s subsequent actions? Not so much. And yet, here’s how Kahneman, in Thinking, Fast and Slow, summarised his views on that kind of priming research: