Living in the age of the search engine and the smartphone means that the answers to everyday questions are only a few taps on a touchscreen away. Unfortunately, a lot of those answers are wrong: That population figure might be out of date, that amazing tidbit about a congressional candidate might have been made up by his opponent.
We’re never going to be able to police the Internet to prevent people from putting up false facts, and it feels Orwellian to try. False information does have a cost, though. The interconnectedness of the Web, the cloud, and social media means that misinformation spreads as fast as information, and all sorts of studies have shown that if people see or hear a statement a few times, they’re far more likely to think it’s true than if they are exposed to it just once. Then they carry it around in their heads long after the original posts have been taken down or corrected.
Is there, instead, a way to help people better unlearn falsehoods? Studying how people are able (or unable) to correct wrong information is a whole research field in psychology. And one of its oddest findings is something called the hypercorrection effect. One would think it’s harder to convince people of the wrongness of things they really think are true than things they only sort of think are true. But it turns out to be the opposite. When a person learns that something they really thought was true is in fact false, they tend to remember the correct answer. But if they only halfheartedly believed the incorrect information to begin with, they’re more likely to forget the correct answer. The effect, discovered by the psychologist Janet Metcalfe a decade ago, has been replicated in several studies since.
In a way this makes sense. It’s surprising and memorable to learn that something one was sure of is totally wrong. An example that comes to mind is a “fact” I learned in elementary school: that nobody knew the world was round until Christopher Columbus sailed to America. Or try this one, from a recent hypercorrection study led by the psychologist Andrew Butler of Duke University: What class of animals contains the closest living relatives of the dinosaurs? Reptiles, right? Wrong: birds.
There are limits to the hypercorrection effect. Butler’s study, published last fall, found that while we are quicker to correct misinformation we were sure of, we’re also more likely to revert to those erroneous beliefs over time than to ones we were less attached to. The reason, he suggests, is that—at least according to one theory of memory—there are two ways memories can be strong: storage strength and retrieval strength. Storage strength is how firmly entrenched a memory is; retrieval strength is how easy it is to mentally call it up. High-confidence errors (Columbus proved the earth is round) have high storage strength and high retrieval strength, because we’ve heard them lots of times. For a while after we hear the correct answer (sailors had known the earth was round since antiquity), the correction has higher retrieval strength than the error. But as time passes, that retrieval strength fades, and since the error still has far greater storage strength because we’ve heard it so many more times, it reasserts its preeminence in our mind.
The trick, then, to dislodging incorrect information is to build up the storage strength of the correct answer—in other words, to repeat it lots of times. Of course, this assumes the person holding the factually incorrect belief is open to changing it. Someone who is not interested in questioning the “fact” that the Apollo moon landing was faked is going to keep believing it.
Still, the idea that the most firmly embraced factual errors are the ones most open to correction is encouraging. People who believe things that are clearly wrong aren’t beyond reach—in fact, the more adamantly wrong they are, the better.