Hindsight Bias is Everywhere

I recently ran into a great example of how pervasive hindsight bias is, and I wanted to share it. It appeared in comments on an op-ed I wrote, published in the San Mateo Daily Journal, one of our local papers. My article argued that the San Carlos City Council, on which I used to serve, should have spent more of its significant financial reserves to help the community through the COVID-19 pandemic.

Two different respondents took issue with the need to do so, arguing that, had the United States (and California) not responded inappropriately to the pandemic, there would’ve been little or no reason to spend reserves because normal activities would not have been so disrupted. Here’s a summary of their argument:

  • Lockdowns were imposed when the pandemic started
  • Lockdowns are not particularly useful, or useful at all, as proven by objective scientific studies
  • Therefore, our elected leaders were guilty of gross mismanagement

On the face of it this is a plausible argument [1]. Case closed, right?

Now let’s timestamp the points:

  • March, 2020: Lockdowns imposed when the pandemic started
  • March, 2022: Lockdowns are not particularly useful, or useful at all, as proven by objective scientific studies
  • March, 2020: Therefore, our elected leaders were guilty of gross mismanagement

Now the argument is revealed as fatally flawed, because it presupposes that data was available in March, 2020 that only came into existence in March, 2022 [2].

This is a classic example of hindsight bias. Humans are notoriously bad at keeping track of exactly when they learn something. Do you remember when you learned calculus? I don’t, not precisely. I know it was sometime between 7th grade [3] and 9th grade [4]. But that’s a pretty big gap in time.

Once we know something we tend to act and think as if we’ve always known it. That’s generally not a problem; in fact, it’s usually an advantage: who wants to waste time validating when we learned something every time we use the knowledge? But it leads us to make fallacious arguments when those arguments depend, as the one I’ve cited does, on the precise sequence of events.

But wait! There were people asserting back in March, 2020 that the lockdowns were too intense and/or unnecessary. In fact, as I recall, the former “very stable genius” who occupied the White House was one of them :). So the argument as presented is in fact valid, right?

Nope. Because there’s a huge difference between an assertion and a fact-as-determined-by-a-clinical-study. Even if they appear to be, or are, the same.

Humans are really bad at distinguishing between assertions and facts, particularly when the facts appear later in time than when the assertions are made. We automatically link the two, and assume the assertion was correct — and actionable — at the time it was first stated.

But, while later analysis (i.e., the clinical study) showed the assertion to be correct, its truth was not known at the time the assertion was made. To have acted upon the assertion as if it were a fact would have been risky, extremely so when facing a highly contagious disease whose lethality was both ill-defined and significant.

It’s important to be aware of all forms of hindsight bias, including the two I’m exploring in depth here, because, as with any bias, failing to account for it can wreck arguments. Which is embarrassing, and potentially dangerous if you’re the decider.

I find a good way to do that is to always consider the provenance of data when I use it in an argument. Who said it? When was it proved/validated? How strong is the proof/validation? Considering all those factors helps me assess how good my argument is, and how likely it is to remain useful over time.

By the way, that last element of provenance I cited — how strong is the proof/validation? — shouldn’t be glossed over, even though I’ve generally found it to be the most difficult factor to assess. The difficulty stems from every fact being only contingently true. The best that can be said about a fact is that it hasn’t yet been disproven, despite efforts to do so. But it might be shown to be wrong at any time.

This contingent nature of facts underlies the entire scientific worldview, and I’d argue is probably one of the greatest insights into the Real World our species ever stumbled across. It literally gave us all the technological wonders that have enriched us enormously.

But it’s not hard-wired into our genes. We each have to learn it. And doing so takes lots of time and effort — it’s one of the main reasons we, as a species, invest so much time, energy, and resources in educating our young.

But it’s well worth it. Because besides making us healthier and wealthier than our ancestors could’ve dreamed, it helps us avoid being embarrassed.


  1. Leaving aside that I’m not sure, from my reading, that the studies they’re referring to showed no benefit. I seem to recall they showed some modest benefits. Then again, they might be referring to different studies than the ones I read about.

  2. The studies might have been released slightly before March, 2022, because there’s usually a lag between when clinical studies are released and when the popular press picks up on them. On the other hand, any such study would be, and was, newsworthy, so I suspect the lag was quite brief.

  3. When I was introduced to algebra and one of my older brothers teased me by giving me his college introductory calculus textbook, saying “try this on for size.”

  4. When I was using it to mathematically model electrical circuits.