Working Hypothesis: How to Avoid Fooling Yourself

Ellie Arroway is a scientist searching for extraterrestrial life. She's listening for radio signals from outer space in the New Mexico desert, waiting for E.T. to phone home.

A loud signal rises above the cosmic noise and jolts her awake. “Holy shit,” she blurts, and speeds back to her office, where she yells at her colleagues, “Make me a liar!” It’s an invitation to prove her wrong.

Her fellow scientists come up with alternative hypotheses about the source of the signal. It could be AWACS, but AWACS status is negative, so that’s ruled out. Other possible sources are eliminated one by one: NORAD’s not tracking any snoops in this vector, and the Space Shuttle Endeavour is in sleep mode. Arroway also checks FUDD to confirm that the signal came from space and not from Earth.

The point source of the signal is later determined. It’s a star called Vega. But instead of settling on the answer, the team immediately moves to prove this hypothesis wrong: Vega is too close, it’s too young to have developed intelligent life, and it has been scanned a bunch of times before with negative results.

The unmistakable signal is a sequence of prime numbers—a clear sign of intelligence. Still, it could be a spoof, a glitch, a delusion; any number of things could have led her team astray. She knows that the discovery has to be independently confirmed.

She dials a colleague at Parkes Observatory, which hosts a radio telescope in New South Wales, Australia. The Aussie colleague confirms the signal.

“Do you have a source location yet?” Arroway asks, without revealing her own findings.

“We put it right smack in the middle,” the Australian replies. After a brief pause he adds, “Vega.”

This is a scene from the movie Contact, which is based on the novel by Carl Sagan. Ellie Arroway is the character played by Jodie Foster.

Although fictional, this story illustrates an important process that scientists follow and that all of us ought to emulate: how not to fool yourself.

The first thing to note is what Arroway does not do. Even when she hears a distinct signal that appears to be a sign of intelligent life, she refrains from immediately blurting out an opinion about what the signal might mean. But in reality, we do the opposite. We accept the first thing that confirms our beliefs as truth. We jump from “This sounds right to me” to “This is true” in no time.

As much as we would like to believe otherwise, facts don’t change our minds. The mind is stubborn, and no matter how strong the facts are, we undervalue evidence that contradicts our beliefs and overvalue evidence that confirms them. Ironically, the same brain that empowers rational thinking also skews our judgments.

In a study of over two hundred people, two-thirds of the participants refused the opportunity to win extra money by listening to the other side’s arguments on same-sex marriage. They didn’t turn down the money because they already knew what the other side thought or because they knew that they were right. The participants explained that hearing the opposing views would be too frustrating and uncomfortable. The same was true for participants on both sides of the argument.

Being certain about our views feels good, and we get a dopamine hit every time we read a confirming fact. In contrast, hearing opposing views is genuinely unpleasant, so much so that people are ready to turn down cash just to stay in their bubble.

Forming an educated opinion takes time and effort that we rarely put in. As a result, we form opinions with zero research. This presents a serious problem.

Once we form an opinion, we fall in love with it. As we flaunt it around, it becomes part of our identity. We become a liberal or a conservative, a paleo or a vegan, a pro-life or a pro-choice person, a neo-liberal or a pro-regulation advocate.

This is when changing our opinion becomes hard, because it calls for changing our identity. This is when dissent turns into an existential death match. After a point, we simply shut down and stop listening to counter-arguments altogether, even when money is offered.

This happens because our opinions weren’t ours to begin with. They were other people’s ideas we borrowed from some video, article, debate, comment, etc. Since we didn’t do the work, they didn’t stand a chance against the first set of counter-arguments.

While there’s no shortcut to forming opinions, a good trick to keep yourself from flaunting one while you’re still working on it is to refer to it as a Working Hypothesis.

Working means it’s a work in progress. Working means it’s less than final. Working means the hypothesis can be changed or abandoned, depending on the facts.

Opinions are defended, but working hypotheses are tested. The test is performed not for the sake of the hypothesis, but for the sake of the facts. Some hypotheses mature into theories, but many others don’t. At any point, it’s better to reject a wrong hypothesis after rigorous research than to carry it around while ignoring the facts.

As Ozan Varol writes, “No one comes equipped with a critical-thinking chip that diminishes the human tendency to let personal beliefs distort the facts. Regardless of our intelligence, Feynman’s adage holds true: The first principle is that you must not fool yourself—and you are the easiest person to fool.”
