Strong Opinions Weakly Held

We never have all the information we need to make fully informed decisions. So it’s better to make a provisional decision, based on what we know so far, than to wait around for complete information to present itself. This is the first step.

The second step is to gather data to disprove this hypothesis and avoid confirmation bias. While this framework may work for personal decisions, it may not work so well in groups: following the principle requires strong personal discipline, and when group dynamics come into play, other members may not be as diligent. In the subsequent paragraphs I discuss a couple of ways we can attack this problem.

In the mid-80s, futurist and entrepreneur Paul Saffo developed a mantra that spread through numerous company cultures in Silicon Valley: strong opinions weakly held. The saying was intended to combat a fairly common problem in startups: the indecision and paralysis that come from uncertainty and ambiguity.

“I have found that the fastest way to an effective forecast is often through a sequence of lousy forecasts. Instead of withholding judgment until an exhaustive search for data is complete, I will force myself to make a tentative forecast based on the information available, and then systematically tear it apart, using the insights gained to guide my search for further indicators and information. Iterate the process a few times, and it is surprising how quickly one can get to a useful forecast,” Saffo wrote.

In other words, we should allow our judgement to come to a stopgap conclusion — no matter how imperfect. This is the “strong opinion” part. But we should also engage in creative doubt, think in bets, and look for indicators that may point in a different direction, letting ourselves be guided to a different conclusion if the new evidence demands it. This is the “weakly held” part. According to Saffo, if we do this enough times, we can reach a useful result through a sequence of faulty conclusions.

So far, so good. But a problem arises when the “boss” has a strong opinion. No matter how weakly held that opinion is, it takes an inordinate amount of conviction and gumption to challenge it.

But this is solvable: as long as there are good protocols for challenging somebody’s opinion in a civilised manner, the principle still works. The real problem is when it is hijacked for the convenience of bullshitters — people who like to voice opinions loudly, over and over, without any acknowledgement that their statements are unsubstantiated, or possibly even damaging. In other words, the “trumps”.

For example, the biggest trump of all, the great Donald Trump, “strongly” held the opinion that Barack Obama wasn’t born in America and that his presidency was therefore illegitimate. This forced the Obama administration to procure his birth certificate from Hawaii and release it for the public to see. Trump didn’t back down until then.

You see, the problem is that trumps have a tendency to turn strong opinions weakly held into strong opinions assumed true until proven otherwise. They hold a strong opinion (based on nothing) but put the onus of disproving it on others. Instead of using the principle, they abuse it.

Donald Trump may be the most obvious (and obnoxious) case, but trumps are lurking everywhere — in teams, meetings, discussions. They may not be full-time trumps, and they aren’t necessarily evil. They are just self-serving folks who want to sway the group towards their own agenda. Most of the time, they aren’t even doing it consciously. I mean, let’s face it, nobody likes to have a strong opinion only to see it debunked by somebody else. It’s against human nature.

There’s always a shrewd person in a meeting who can state their case with absolute certainty and shut down further discussion. Others either assume that this person knows best, or don’t want to stick their neck out and risk criticism. This is especially true if the person in question is senior in the hierarchy, or if there is some sort of power differential in the group.

We can’t do much about the trumps in politics. In their defence, they have to hold strong opinions strongly to win the support of the crowd. If a leader says their chances of curbing a virus are only 80%, they may not win the next election.

But in a much smaller setting — such as in a group — there are certain measures we can take to design a process that makes sure voices are heard and strong opinions are weakly held for real. And the first step in the process is to know the strength of one’s opinion.

When asked how sure somebody is about something, the late, great Amos Tversky used to joke, the human brain falls back to “yes I believe that, no I don’t believe that, and maybe” — a dial with just three settings, by default. But sound decision-making — especially in groups — needs more nuance and, for lack of a better word, more dials.

Let me illustrate the gravity of the problem with an example. In the late 1940s, the Communist government of Yugoslavia broke from the Soviet Union, raising fears that the Soviets would invade. US intelligence analysed the threat in a report known as a National Intelligence Estimate (NIE). “Although it is impossible to determine which course of action the Kremlin is likely to adopt,” it concluded, “we believe that the extent of Eastern European military and propaganda preparations indicates that an attack on Yugoslavia in 1951 should be considered a serious possibility.”

By most standards, that is clear, meaningful language. A “serious possibility” indicates a strong likelihood of attack. No one suggested otherwise when top officials in the US read the report.

But it was later found that by “serious possibility” the report actually meant that “the odds were about 65 to 35 in favour of an attack.” Interestingly, the analysts who wrote the report had varying interpretations of the phrase themselves. When asked, one of them said it meant odds of about 80 to 20 (four times more likely than not) that there would be an invasion, while another thought it meant odds of 20 to 80 — exactly the opposite. The remaining answers were scattered between these extremes, despite all of them having written the report together.

This floored Sherman Kent, who had overseen the report. You see, something that is “possible” has a likelihood ranging from almost zero to almost 100%, which isn’t helpful. Kent suggested that we narrow the range of our estimates to better communicate a forecast, and, beyond that, assign a numerical value to each estimate so as to avoid any ambiguity. He suggested the following:

  • 100% → Certain
  • 93% (give or take about 6%) → Almost certain
  • 75% (give or take about 12%) → Probable
  • 50% (give or take about 10%) → Chances about even
  • 30% (give or take about 10%) → Probably not
  • 7% (give or take about 5%) → Almost certainly not
  • 0% → Impossible

Words like “serious possibility” suggest the same thing these numbers do; the only real difference is that the numbers make it explicit, thereby reducing the risk of confusion. Bingo! Sherman Kent is often described as the father of intelligence analysis, and not without good reason.
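
To make the mapping concrete, here is a minimal sketch in Python (hypothetical names; Kent published a table, not code) of the scale as a lookup table, with a helper that translates a raw probability into the nearest phrase:

```python
# Sherman Kent's estimative scale as a lookup table. The phrases and
# band widths come from the list above; everything else is illustrative.
KENT_SCALE = [
    # (centre of band, give-or-take, phrase)
    (1.00, 0.00, "Certain"),
    (0.93, 0.06, "Almost certain"),
    (0.75, 0.12, "Probable"),
    (0.50, 0.10, "Chances about even"),
    (0.30, 0.10, "Probably not"),
    (0.07, 0.05, "Almost certainly not"),
    (0.00, 0.00, "Impossible"),
]

def phrase_for(probability: float) -> str:
    """Return the phrase whose band centre is nearest the estimate."""
    centre, _, phrase = min(KENT_SCALE, key=lambda row: abs(row[0] - probability))
    return phrase

print(phrase_for(0.65))  # Probable
```

Under such a convention, the NIE’s 65-to-35 estimate would simply have been reported as “probable”: still a phrase, but one anchored to an agreed numeric band.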

Now, how can we translate this into an office setting? First, stating our strong opinion as a probability judgement forces us to calibrate the strength of that opinion. In other words, it forces us to let go of the ‘yes, no, maybe’ dial in our heads.

Not only that, by framing our forecast/conclusion/opinion/prediction as a bet, we suddenly have skin in the game, and are motivated to get things right from the beginning.
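
One standard way to give that bet teeth, offered here purely as an illustration rather than something the essay’s sources prescribe, is to score each forecast against the eventual outcome, for example with a Brier score: the squared error between the stated probability and what actually happened.

```python
def brier_score(forecast: float, outcome: int) -> float:
    """Squared error between a probability forecast and a 0/1 outcome.
    Lower is better; confident-and-wrong is punished hardest."""
    return (forecast - outcome) ** 2

print(brier_score(0.65, 1))  # 0.1225 -- hedged and right: decent
print(brier_score(0.95, 0))  # 0.9025 -- loudly certain and wrong: terrible
print(brier_score(0.50, 0))  # 0.25   -- a permanent 'maybe' is mediocre either way
```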

When the boss replaces “there is a strong possibility that we can increase our revenue by raising our prices” with “I’m 65% sure that we can increase our revenue by raising our prices,” light bulbs go off everywhere. Now everyone can look into the variables involved, consider the case from the boss’s point of view, and weigh it against their own conclusions.

If the stated probability seems too high or too low, the line of enquiry flows effortlessly into exploring why the mental models differ so much, and how to synchronise them. This way, we argue over the principle instead of the person.
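
A simple way to run this in practice, sketched here under the assumption that every member privately writes a number down before the discussion starts, is to collect the estimates and surface the spread:

```python
# Hypothetical estimates for the claim "raising prices will increase
# revenue", gathered privately before anyone speaks.
estimates = {
    "boss": 0.65,
    "ana": 0.40,
    "bilal": 0.70,
    "chen": 0.25,
}

low = min(estimates, key=estimates.get)
high = max(estimates, key=estimates.get)
spread = estimates[high] - estimates[low]

print(f"spread: {spread:.0%} ({low} at {estimates[low]:.0%}, "
      f"{high} at {estimates[high]:.0%})")
# spread: 45% (chen at 25%, bilal at 70%)
```

A wide spread is the cue to compare mental models: the discussion starts with why chen and bilal disagree, not with who outranks whom.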

But many people struggle to view the world in terms of estimates and probabilities. For example, telling people that wearing a mask makes them 70% less likely to accidentally infect someone is rarely a strong enough argument to change their behaviour. To motivate people to change the status quo, we have to strip the message of all uncertainty.

People don’t understand probability, so it’s better not to use it unless absolutely necessary. Saying something like, “There’s a 75% chance this company’s gonna go down, but let’s give our 100%,” isn’t likely to get the team motivated.

Truth is, we love catchphrases. We love drama. Poetry moves people. That is the language to use when we need to rally people’s support for a cause. But when we want to bring people together to make an informed decision, stating predictions in terms of estimated probability is a quicker and more effective way to draw people into the process. Because when it comes to finding the truth, it’s better to be a bookie than a poet. I’m 90% sure about it.
