
Checking in with our biases

Some problems we solve in an instant; others bring us to a crawl. Ever wondered why?

Let’s put our brains to work! How long does it take you to solve each of the problems below?

Problem A:

Which lady is unhappy?


Problem B:

What is 27*18?


A: probably a fraction of a second

B: Well, I’m still working on it…


In Thinking, Fast and Slow, Daniel Kahneman, the Nobel Prize-winning psychologist and author, explores this problem-solving difference through two systems of thinking:


System 1 is fast, automatic, and intuitive. It operates effortlessly, handling routine decisions and everyday tasks with mental shortcuts based on past knowledge and experiences. This is how we solved Problem A.


System 2 is slow, deliberate, and analytical. This system requires conscious effort and is activated for complex tasks and decisions, like the above math problem.

In most daily situations, System 2 simply endorses the quick judgments of System 1, but in some cases (like the math problem) System 2 has to take the lead and invest the time and effort to solve it.
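For instance, one way System 2 might grind through Problem B is by breaking it into easier pieces:

```latex
27 \times 18 = 27 \times (20 - 2) = 540 - 54 = 486
```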

This is an efficient setup: we get maximum performance for minimal effort. Yet many of us over-rely on the effortless System 1 and underuse the analytical System 2.

This imbalance produces biases: System 1's instant reactions are shaped by past experience and pattern-matching rather than by logic or facts.

These biases can cause limiting beliefs, overconfidence, poor judgment, and irrational choices. Here are some examples:

  • People react differently to a hypothetical diagnosis framed as a 90% survival rate versus a 10% mortality rate – the latter sounds more dire, even though both describe exactly the same risk (the framing effect).

  • Losing $100 feels more painful than gaining $150 feels pleasurable (loss aversion).

  • After watching news reports about airplane crashes, people might overestimate the danger of flying, even though driving is statistically far more dangerous (the availability heuristic).

  • People with strong political views tend to seek out news sources that align with their beliefs and avoid sources that challenge them, reinforcing their existing opinions and creating an echo chamber (confirmation bias).
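The loss-aversion asymmetry above is often modeled with the value function from Kahneman and Tversky's prospect theory. A minimal sketch, using the commonly cited parameter estimates from their 1992 work (λ ≈ 2.25, α = β = 0.88) purely as illustrative assumptions:

```python
# Prospect-theory value function: losses loom larger than gains.
# Parameters are the commonly cited Tversky & Kahneman (1992) estimates;
# treat them as illustrative, not definitive.
ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss aversion: losses weigh ~2.25x as much as gains

def subjective_value(outcome: float) -> float:
    """Map a dollar gain or loss to its felt (subjective) value."""
    if outcome >= 0:
        return outcome ** ALPHA
    return -LAMBDA * ((-outcome) ** BETA)

gain = subjective_value(150)   # ≈ +82
loss = subjective_value(-100)  # ≈ -129
print(f"gaining $150 feels like {gain:+.0f}; losing $100 feels like {loss:+.0f}")
```

Even though the gain is $50 larger in dollar terms, the felt loss dominates – which is exactly why the trade feels unappealing to most people.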


Understanding how we think helps us recognize and reduce cognitive biases. Here are some practical strategies:

1. Slow down and engage System 2: To counteract biases, slow down and engage in deliberate, analytical thinking. Pause and reflect; seek additional information.

2. Use structured decision-making processes: Implement structured processes to reduce bias influence. Create checklists for high-stakes or complex decisions; use decision trees to map out possible outcomes and their probabilities.

3. Consider alternative perspectives: Actively seek out and consider different viewpoints to mitigate confirmation bias and overconfidence. Assign someone to play devil’s advocate; work with diverse teams.

4. Run a pre-mortem analysis: Before finalizing a decision, imagine it has failed and work backward to determine the cause. This helps identify potential risks and weaknesses.

5. Encourage accountability: Make the decision-making process transparent, regularly review decisions, and provide feedback. This encourages more careful and deliberate thinking.
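The decision trees in strategy 2 can be as simple as a few lines of code: list each option's possible outcomes with their probabilities, then compare probability-weighted averages. A hypothetical sketch (the options, dollar values, and probabilities here are invented for illustration):

```python
# Each option maps to a list of (outcome value, probability) branches.
# All names and numbers are hypothetical, just to show the mechanics.
options = {
    "launch now":   [(100_000, 0.3), (-20_000, 0.7)],  # big win, likely flop
    "delay + test": [(70_000, 0.6), (-5_000, 0.4)],    # smaller win, better odds
}

def expected_value(branches):
    """Probability-weighted average outcome of one option."""
    return sum(value * prob for value, prob in branches)

for name, branches in options.items():
    print(f"{name}: expected value = {expected_value(branches):,.0f}")

best = max(options, key=lambda name: expected_value(options[name]))
print("System 2 pick:", best)
```

Writing the numbers down forces the deliberate System 2 comparison; the flashy option that System 1 favors on gut feel may well lose once the probabilities are made explicit.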

By incorporating these strategies, you can improve decision-making, reduce cognitive biases, and make more rational and informed choices.


