Reducing Bias as a Product Manager

Thinking, Fast and Slow

Human brains are pretty amazing. We take in millions of pieces of information every second, process them, make connections with what we already know, and make decisions based on that. The way we manage that is by taking lots of shortcuts—we could not possibly consciously consider all of this information, so a lot of the information is processed by our subconscious.

Our subconscious is great at recognizing patterns and their relationships and at reacting to predictable situations automatically. If you see the brake lights of the car in front of you, you step on the brakes yourself almost without conscious thought: you don’t have to actively recall what the red lights mean, work out how they will change the speed of the car in front, and calculate that it would collide with your car if you didn’t also brake.

The award-winning book “Thinking, Fast and Slow” by Daniel Kahneman calls this subconscious processing of information “System 1” thinking. System 1 is fast, intuitive, and emotional. In contrast, we can also employ System 2, which is slower, more deliberative, and more logical.

What makes System 1 great at quick decision making is that it uses a lot of shortcuts, pattern matching, and heuristics. If a situation looks similar to something you’ve seen before, then System 1 will make a decision similar to the previous times the situation was encountered. This has a lot of drawbacks, though. If making the best decision is more important than making a “good enough” decision, then System 1 will not yield the best results. Moreover, there are several systematic cognitive biases that are evident in human decision making:

Cognitive Bias Codex
John Manoogian III, licensed under CC BY-SA 4.0.

Some of these biases can be overcome simply by switching to “System 2” explicitly, in other words, by reasoning through a decision rather than making it on “autopilot”. However, since System 2 also works with inputs that come pre-processed by System 1 (we can’t reason through millions of pieces of information directly), even System 2 can still fall prey to these biases.

Understanding how our thinking is biased is an important skill for product managers for several reasons: tapping into our users’ and customers’ biases helps build more successful products, understanding the biases of team members and stakeholders helps communicate and align them more effectively, and understanding our own biases helps make better product decisions.

In this article, I will focus on the last aspect.

There are three steps to less biased thinking as a product manager:

  1. Employ System 2 deliberately
  2. Be aware of specific biases
  3. Counteract your biases

Employing System 2 deliberately means recognizing when decisions are important, and then making those important decisions by reasoning them through rather than by snap judgment. Hopefully, most product managers already do this: even if you had never heard of System 1 and System 2, you probably realize that important decisions are best made with more time and not just based on “gut feeling” (which is really just a euphemism for System 1).

The second step is being aware of specific biases, understanding in which ways our thinking tends to be flawed. Not all of the biases in the above picture are equally relevant—in the second half of this article, we will go deeper on a few specific ones. Once you understand the relevant biases, you can simply ask yourself “am I falling prey to bias XYZ?” and then employ System 2 to reason through whether or not your thinking is biased (and correct for that bias, if necessary).

The last step is counteracting the biases: putting measures in place that prevent you from being biased in the first place. In general, one of the best ways to mitigate cognitive biases as a product manager is to write down your decision process and criteria before you start looking at the available information; we will get a little more specific below.

Particularly relevant biases for PMs

IKEA effect

The IKEA effect means that we ascribe disproportionately high value to products we helped create. This bias is extremely relevant in product management: we are much less likely to stop a project or remove a feature that we have put effort into building than we would be if the same project or feature were someone else’s.

To counteract this bias, here are two tips. Firstly, involve a neutral third party in these decisions, ideally someone who doesn’t know the team at all (otherwise, they might soft-pedal their advice to spare the team’s feelings). Secondly, write down decision criteria before you start building, for example, which KPIs you expect a new feature to improve. Then, if the feature fails to move those KPIs, it will be much harder to rationalize keeping it in the product.
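To make the second tip concrete, here is a minimal sketch of what such a pre-registered decision record could look like if you expressed it in code; the feature name, KPI, and numbers are hypothetical, and in practice a shared document serves the same purpose:

```python
from dataclasses import dataclass

@dataclass
class FeatureDecisionRecord:
    """Success criteria, written down before building starts."""
    feature: str
    kpi: str
    baseline: float          # KPI value before launch
    expected_uplift: float   # minimum relative improvement committed to up front

    def keep_feature(self, measured: float) -> bool:
        """After launch, compare the measured KPI against the pre-registered bar."""
        return measured >= self.baseline * (1 + self.expected_uplift)

# Hypothetical example: we committed to a 5% lift in weekly retention.
record = FeatureDecisionRecord("smart-reminders", "weekly_retention", 0.40, 0.05)
print(record.keep_feature(0.41))  # False: 0.41 is below the 0.42 bar we set up front
```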

Confirmation bias

Confirmation bias means that we seek out evidence that supports our view, and disregard evidence that doesn’t. Especially in conjunction with the IKEA effect, this means that we are not nearly skeptical enough about the features that we build and the ideas we have. It also means that we might latch onto views about the product, the market, or our customers that are not true.

Confirmation bias can be quite hard to overcome, because you cannot factor into your decision making evidence that you have already disregarded and filtered out. The best remedy, beyond being aware of the bias and staying open-minded about the possibility that your view is wrong, is to actively seek out evidence that goes against your view. You can do that by asking “what signals would tell me that I am wrong?” and then actively looking for those.

Sunk cost fallacy

Somewhat related, the sunk cost fallacy is the tendency to factor already incurred (“sunk”) costs into forward-looking decisions, often described as “throwing good money after bad”. This happens all the time in product management. Consider, for example, a big project that was supposed to take 6 months. 4 months into the project, the biggest customer that was going to use the feature churns, or some new difficulty is discovered that will delay the project by another 6 months. The natural tendency is to say “we’ve already spent 4 months on this project, we can’t stop now!” That is the sunk cost fallacy at work. Instead, you should reevaluate the decision as if you were making it from scratch: is the return on investment still there after these changes in circumstances?
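To see the forward-looking reasoning in numbers, here is a minimal sketch with hypothetical figures; the key point is that the months already spent appear nowhere in the comparison:

```python
# Hypothetical figures for the delayed project described above.
months_spent = 4              # sunk: must play no role in the decision
months_remaining = 6          # the newly discovered extra work
cost_per_month = 50_000       # fully loaded team cost
expected_value = 250_000      # revised down after the biggest customer churned

# Forward-looking comparison: only the remaining cost matters.
remaining_cost = months_remaining * cost_per_month   # 300,000
print(expected_value > remaining_cost)  # False -> stop, regardless of the 4 months spent
```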

To avoid falling prey to the sunk cost fallacy, it helps to have clearly documented decision criteria and to go back to them whenever something material changes. If we had not already invested time and money into this course of action, would we still choose it? If not, abandon it.

Availability heuristic / recency effect

The availability heuristic and the recency effect mean that we treat examples that come easily to mind (for example, because we recently encountered them) as more representative of the whole than they actually are. For instance, because we recently sat in on a sales call or read a couple of support tickets where a particular feature was requested, we believe that feature to be in high demand, even though it might not be.

Of course, it’s good to “keep your ear to the ground”, listen to customers, and stay responsive. However, if you always chase whatever concern you heard most recently, you will run around like a headless chicken and build a product that follows no consistent direction.

To avoid falling into that trap, be aware of this bias and collect evidence for and against an assessment more systematically. If you have a data point that you collected recently, try to back it up with more historic data, for example, how often the same request has come up in the past. Having a clear product strategy, which sets priorities at the highest level, also helps. A good strategy is created from a diligent analysis of the situation and the biggest challenges, unaffected by whatever happened most recently. Every product decision can then be evaluated against the strategy instead of against just the most recent evidence.
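As a minimal sketch of what backing a recent data point up with historic data could look like, the following snippet counts how often each feature has actually been requested in a hypothetical support ticket export, instead of trusting the two tickets read this week:

```python
from collections import Counter

# Hypothetical support ticket export: (ISO week, requested feature tag).
tickets = [
    ("2024-W18", "dark-mode"), ("2024-W18", "csv-export"),
    ("2024-W20", "csv-export"), ("2024-W31", "sso"),
    ("2024-W32", "csv-export"), ("2024-W40", "dark-mode"),
    ("2024-W41", "dark-mode"), ("2024-W41", "dark-mode"),  # this week's two tickets
]

# Overall frequency puts the recent cluster in context.
counts = Counter(feature for _, feature in tickets)
print(counts.most_common())
# [('dark-mode', 4), ('csv-export', 3), ('sso', 1)]
# Without this week's two tickets, dark-mode looks no hotter than csv-export.
```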

Anecdotal fallacy

The anecdotal fallacy is closely related to the availability heuristic and the recency effect. Human beings have a natural affinity for stories (anecdotes). Stories affect us more deeply than sheer numbers do, because they tap into our empathy for other people. For this reason, we put a great deal of weight on ideas and perspectives that are supported by anecdotes. We also remember anecdotes more easily than other facts, which plays into the availability heuristic. As a result, if someone tells us a story about how they had a particular problem with our product, we will likely want to fix that problem even if the data tells us it is a rare occurrence.

The remedy for this fallacy is similar to the one for the availability heuristic and recency effect: collect evidence systematically. In addition, and specific to the anecdotal fallacy, we should deliberately overweight quantitative information relative to qualitative anecdotes, because our natural tendency is to do the opposite. I am not saying we should rely only on quantitative data; in fact, that would be a big mistake. However, if we have some anecdotes saying one thing and data saying the opposite, then we should probably listen to the data (and conduct qualitative research to find out the reason for the disagreement).

Hindsight bias

Hindsight bias is well known: “Hindsight is always 20/20”. It is our tendency to think that past events were more predictable than they actually were. As a product manager, this can prevent you from properly assessing risks and learning from wrong assumptions. To address this, you should turn “we should have known” into “how could we have known”.

The best way to do this is to write down the assumptions you are making and the risks you have identified early in the project, assess how likely you think they are and why, and then make plans to validate them. That is a good idea anyway, and part of the lean startup “MVP” approach, but it also lets you combat hindsight bias. If a risk materializes that seems obvious in retrospect, you can go back to your initial notes and see what you thought at the time. Did you not identify the risk at all? Then you probably didn’t brainstorm risks thoroughly enough. Did you see it, but give it a low likelihood? Then revisit the reasons you documented and see whether there was a fundamental flaw in your assessment that can be fixed. Did you validate the underlying assumption, but not thoroughly enough? Then perhaps you were biased in your validation, or your validation technique was flawed, and you can correct that next time.

By comparing after-the-fact assessments with the before-the-fact assessments you documented, you can spot flaws in your initial judgment and take steps to correct them over time.

Self-serving bias and fundamental attribution error

Self-serving bias and the fundamental attribution error mean that we tend to attribute our own failures to circumstances, but others’ failures to their character (or abilities). These biases matter in many situations for product managers: if we have issues using someone else’s product, that’s because of poor UX; if someone has issues using our product, they must be stupid. If we forgot to respond to a stakeholder’s email, it’s because we had too many things on our plate; if someone else does the same to us, they are deliberately ghosting us. If we didn’t deliver the spec on time, it’s because we got pulled into some emergency; if an engineer is late with their feature, it must be because they are lazy.

For this bias, the biggest antidote is awareness, and consciously trying to come up with explanations for other people’s behavior that don’t assume failings in their character or abilities.

Curse of knowledge

The curse of knowledge refers to the fact that once you have learned something, it is all but impossible to see things from the perspective of someone who hasn’t. This is very challenging for product managers: we are experts in our own products and know all the flows, interactions, and idiosyncrasies. It can be really hard to understand the challenges a novice user might have with the product, because everything seems obvious from our perspective. The same is true for the tools and processes that we use: they can be hard to understand or inefficient, but because we’ve learned to live with them, we no longer see the flaws.

The only way around the curse of knowledge is to be aware of it and to regularly talk to novices who don’t yet have that knowledge. Another good practice is to have anyone who newly joins the team write down, during their first month or so, their impressions of the product and of how things are done, and then share that back with the rest of the team. These initial, unbiased views often hold the kernel of really valuable improvement ideas.

Conclusion

Biases are natural; all human beings are biased. They stem from the need to constantly handle more information than we can consciously process. Some of them, like the ones outlined in this article, can easily lead to suboptimal decisions about the products we build and how we work. Thankfully, many of these biases can be counteracted by being aware of them and taking deliberate steps to address them.

I hope you found this article useful. If you did, feel free to follow me on Twitter where I share thoughts and articles on product management and leadership.


About Jens-Fabian Goetzmann

I am currently Head of Product at RevenueCat. Previously, I worked at 8fit, Microsoft, BCG, and co-founded two now-defunct startups. More information on my social media channels.
