The Startup Pyramid of Psychology: How to See Blindspots & Discover Insights

Humans tend to subconsciously distort information and sculpt it to fit their existing beliefs. Confronting one’s own cognitive biases is critical when starting a company.

Most startups begin with an idea and then follow a series of winding steps — user research, design, launching an MVP — all on a quest to find product-market fit. Each of these steps can produce a ton of feedback (UX data, surveys, engagement metrics, etc.), and it can come from a diverse set of people: customers, employees, investors, even family and friends. The difficulty comes in deciding what to do with all this information. Faced with a deluge of feedback and opinion, founders struggle to analyze and weigh it objectively. Most tap into a domain of subjective thought that psychologists Daniel Kahneman and Amos Tversky have termed "biases and heuristics." These psychological distortions can produce blindspots in how you evaluate your product or service.

Blindspots

Israeli psychologist and Nobel laureate Daniel Kahneman popularized the "biases and heuristics" concept in his 2011 book, Thinking, Fast and Slow (which I recommend reading). His seminal work has influenced the field of behavioral economics and also offers some valuable and concrete insights for startup founders who want to make sure that they aren't fooling themselves.

In the book, Kahneman argues that people think in two "systems." "System 1" is our default and intuitive method of thinking — the kind of thinking that feels effortless and fluid, but too often takes sloppy mental shortcuts which lead to dead ends and biased thinking. Kahneman defines "System 2" as our analytical and reflective method of thinking. It requires more effort on our part and, as a result, we don't always call it into action when we should.

Kahneman's insight has a lot to teach us about starting a company. Founders have to deal with an objective reality where self-deception is fatal, and relying on System 1 thinking makes it far more likely that you will reach biased conclusions and suffer unfortunate outcomes. Below are three of the major biases from Kahneman's work that I have seen most often impact early stage startup thinking. Reviewing these three biases and evaluating your own thinking against them can act as a sanity check, helping you give your reflective System 2 a chance to catch up with your shoot-from-the-hip, impulsive System 1 brain. (If you want a guide for working through these ideas and applying them to your business, check out this worksheet I put together: Blindspots & Insights Template.)

Availability Bias

Definition: The availability heuristic, also known as availability bias, is a mental shortcut that relies on the immediate examples that come to mind when evaluating a specific topic, concept, method, or decision. The availability heuristic operates on the notion that if something can be recalled, it must be important, or at least more important than alternative solutions that are not as readily recalled (Wikipedia: https://en.wikipedia.org/wiki/Availability_heuristic).

The 'availability bias' leads many founders to dramatically overvalue their most recent user interviews, the customers they interact with most frequently, a news article they just read in TechCrunch, and so on. In the Blindspots & Insights Template I shared above, I included a user interview matrix so you can score feedback over time and avoid placing too much weight on recency.
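To see why recency weighting matters, here is a minimal sketch (not the template itself, and the scores are hypothetical) contrasting an even-handed average of interview scores with the recency-skewed average that availability bias effectively computes:

```python
# Hypothetical interview ratings on a 1-10 scale, oldest first.
scores = [3, 4, 3, 4, 9]  # four lukewarm interviews, then one glowing recent one

# What a template that weighs all feedback evenly would compute.
equal_weight = sum(scores) / len(scores)

# What availability bias effectively computes: recent interviews dominate
# because they are the easiest to recall.
decay = 0.5  # each older interview counts half as much as the next newer one
weights = [decay ** (len(scores) - 1 - i) for i in range(len(scores))]
recency_weighted = sum(w * s for w, s in zip(weights, scores)) / sum(weights)

print(f"Equal weighting:   {equal_weight:.1f}")      # 4.6: mostly lukewarm
print(f"Recency weighting: {recency_weighted:.1f}")  # ~6.4: one interview dominates
```

Same data, very different conclusions; scoring feedback over time is what keeps the glowing interview you had yesterday from drowning out the previous month.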

Representative Bias

Definition: The representative bias describes our intuitive desire to group objects together based on perceived logical patterns. This is effectively the process of stereotyping (and as we know from life experience, our stereotypes often prove to be very wrong). The reason 'representative bias' is so dangerous is that it leads us to incorrectly evaluate the 'base rates' for any given situation. This means we often intuitively assign higher probabilities to less likely cases than to more likely ones because of the mental models and associations ingrained in our minds. This bias is highly problematic and essential to correct for when we move to the section on measuring, predicting, and updating.

Example: The most famous illustration of this fallacy is the Linda the Bank Teller case, and it exposes the flaw in our thinking about 'base rates' very well. In 1983, Tversky and Kahneman asked participants to solve the following problem:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable? (a) Linda is a bank teller, or (b) Linda is a bank teller and is active in the feminist movement.

More than 80 percent of participants chose option (b), regardless of whether they were novice, intermediate, or expert statisticians. However, the probability of two events occurring in conjunction is always less than or equal to the probability of either one occurring alone. Compare the Linda case to the following: which is more likely, that (1) you will have a flat tire tomorrow morning, or that (2) you will have a flat tire tomorrow morning and a man in a black car will stop to help you out? In this case, it should be evident that (2) cannot be the more likely outcome.
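The conjunction rule is easy to verify with a few lines of arithmetic. The probabilities below are made up for illustration:

```python
# Illustrative, invented probabilities for the flat-tire example.
p_flat = 0.01             # P(flat tire tomorrow morning)
p_help_given_flat = 0.10  # P(a man in a black car stops to help, given a flat)

# The conjunction: both events happening together.
p_flat_and_help = p_flat * p_help_given_flat  # 0.001

# P(A and B) can never exceed P(A), no matter how plausible the story feels.
assert p_flat_and_help <= p_flat
print(f"P(flat) = {p_flat}, P(flat and helped) = {p_flat_and_help}")
```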

One explanation of why we commit the conjunction fallacy in cases like the Linda the Bank Teller case is that we incorrectly employ what Tversky and Kahneman call the representative heuristic. Even though, logically, we should not pick option (b), it feels more likely because it correlates with what Linda did in college. With her particular background, Linda is representative of someone who is a feminist. (https://www.psychologytoday.com/us/blog/the-superhuman-mind/201611/linda-the-bank-teller-case-revisited)

This bias is the root of so many errors in judgement and in forecasting future events and their likelihoods. Remember that probability is not the same thing as plausibility.

When it comes to startups, 'representative bias' is one of the worst offenders. It can block us from finding unexpected insights because it locks in our beliefs and expectations about how cohorts of people group together, and it prevents us from looking at things from new perspectives that don't conform to our existing mental models.

Confirmation Bias

Definition: Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms one's beliefs or hypotheses while giving disproportionately less attention to information that contradicts them. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position. Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions due to these biases have been found in political and organizational contexts. (https://en.wikipedia.org/wiki/Confirmation_bias)

Example: Current political polarization is a great example of confirmation bias. Tune in to MSNBC and then Fox News and watch how differently they present the same set of facts to support two different narratives.

If employing 'representative bias' is like playing Russian roulette, then 'confirmation bias' is drinking arsenic outright. The emotional attachments that people develop to their ideas and perspectives can be very powerful. When you start hearing expressions like "we need to educate the market," or watch recurring feedback get dismissed because it's just 'wrong,' then you know you are in trouble. The best antidote for 'confirmation bias' is keeping an open mind and consciously trying to create alternative narratives and possibilities.

This is just a brief sampling of the major biases that psychologists have uncovered in recent years, but these three are the most common stumbling blocks I've encountered, both as a founder myself and as a mentor to others.

Disclaimer: It is very common for founders to confuse efforts to reduce bias with a mindset heavily focused on avoiding wrong decisions. This can lead to analysis paralysis and is not productive. Efforts to reduce bias do not mean that we need to feel certain before trying a new idea or that we should feel scared of being wrong. If anything, the opposite is true. As you will read in the next sections, the faster that we can generate feedback and new information, the sooner we can update our beliefs and get to a good answer. Forecasting & insight both use information as fuel.

Superforecasting (Measure, Predict, Update)

Now that we understand the two-system cognitive model popularized by Daniel Kahneman (System 1: intuitive; System 2: analytical) and are aware of the blindspots it can produce, we can turn to the question of how to systematically correct for these errors.

A great place to look in our search for a framework that will help us reduce bias in our thinking is the work of psychologist Philip Tetlock. He is one of the co-founders of The Good Judgment Project and author of an exceptional book called Superforecasting: The Art and Science of Prediction. Superforecasting tackles the question of whether certain people are better at predicting the future and, if so, what common attributes these people share. It also asks what kinds of events are 'predictable' at all. The answers to these questions, it turns out, are heavily connected to the cognitive biases discussed in the last section.

The Good Judgment Project, in conjunction with the US government through IARPA (Intelligence Advanced Research Projects Activity), launched a geopolitical forecasting tournament that invited people from all over the world to make predictions about major events and compete with each other as well as against analysts in the CIA, the Department of Defense, and other official organizations. The members of these government organizations had access to classified information that other tournament participants did not. Interestingly, this access to better information did not lead to more accurate predictions. Rather, a cohort of 'superforecasters' emerged from the tournament able to predict events 30% better than the analysts with classified information. Tetlock interviewed these superforecasters at length to understand their methods and thought processes.
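A quick aside on scorekeeping: forecasting tournaments like this one typically grade participants with Brier scores, the squared gap between the probability you assigned and what actually happened. Here is a minimal sketch with made-up forecasts:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and binary outcomes.
    0.0 is perfect; always hedging at 50% earns 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Made-up forecasts: probabilities for events that did (1) or did not (0) occur.
print(brier_score([0.9, 0.8, 0.1], [1, 1, 0]))  # 0.02  (confident and right)
print(brier_score([0.5, 0.5, 0.5], [1, 1, 0]))  # 0.25  (hedging everything)
```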

From those interviews, Tetlock identified five core cognitive processes that correct for bias.

1. Breaking complex problems down into smaller, more tractable questions.

The classic training exercise here is a Fermi estimation problem, such as: how many piano tuners are there in Chicago? To answer this question using the Fermi estimation approach, you might want to break it into four easier questions:

1) How many people are in Chicago?
2) Out of every 100 people, how many have a piano (a piano base rate)?
3) How many pianos can each piano tuner work on per day?
4) How often do people need their piano tuned?

By answering these four much more manageable questions, you can arrive at a ballpark estimate of the more complicated question. An added benefit of this approach is that it acts as a forcing function, ensuring you use a coherent logical process to arrive at your conclusions.
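Here is what that estimate looks like in code. Every input below is a rough, order-of-magnitude assumption, which is the whole point of a Fermi estimate: the structure matters more than the precision.

```python
# All inputs are rough, order-of-magnitude assumptions.
chicago_population = 2_700_000
pianos_per_person = 1 / 100      # base rate: roughly one piano per 100 people
tunings_per_piano_per_year = 1   # assume a piano gets tuned about once a year
tunings_per_tuner_per_day = 4
working_days_per_year = 250

pianos = chicago_population * pianos_per_person                        # ~27,000
tunings_needed = pianos * tunings_per_piano_per_year                   # ~27,000/yr
tunings_per_tuner = tunings_per_tuner_per_day * working_days_per_year  # ~1,000/yr

print(f"Ballpark: ~{tunings_needed / tunings_per_tuner:.0f} piano tuners")  # ~27
```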

2. Establishing data-driven 'base rates'.

Consider a second question: was Yasser Arafat poisoned by Israel using polonium?

Here is a potential approach to breaking this question down (and keep in mind there can be other ways of doing it). We can start by establishing a 'base rate': how many political poisonings have taken place in the last five years? We could find out how polonium can end up in someone's body and whether there is a common source other than poisoning. We could find out how many assassinations have taken place in the past five years between Israel and Palestine. We could then aggregate these findings and shift away from the base rate accordingly.

By breaking down complex problems into simpler questions with quantifiable answers and establishing data-driven 'base rates', we activate our System 2 thinking and correct for representative and availability bias.
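Here is a minimal sketch of what 'shifting from a base rate' could look like. Every number below is an invented placeholder, not research, and the multiplicative shifts are a rough stand-in for a proper Bayesian treatment (see the updating sketch in point 4):

```python
# Invented counts, purely for illustration.
investigated_political_deaths = 200  # hypothetical deaths of political figures examined
confirmed_poisonings = 10            # hypothetical number that were actually poisonings

base_rate = confirmed_poisonings / investigated_political_deaths  # 5% starting point

# Shift away from the base rate as each finding comes in.
estimate = base_rate
estimate *= 3.0   # polonium in the body, which rarely shows up without foul play
estimate *= 1.5   # a documented history of assassinations in the conflict

print(f"Base rate: {base_rate:.0%}, shifted estimate: {min(estimate, 1.0):.0%}")  # 5% -> 22%
```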

3. Constantly & deliberately evaluating alternative explanations and perspectives.

Another interesting finding from the IARPA tournament was that the superforecasters performed even better in groups than they did individually. When asked about the benefit of team collaboration, superforecasters pointed out that their teammates challenged their existing beliefs and surfaced alternative possibilities for any given event, and that they kept an open mind and incorporated those alternatives into their mental models.

There are two key lessons for founders here: actively seek contrary explanations and perspectives for any problem you face, and work to foster a culture where respectful disagreement is encouraged. One useful strategy for encouraging respectful disagreement is 'precision questioning': making sure the questions you ask another person are specific to the claims being challenged and are framed objectively. This process promotes open-mindedness and goes a long way toward combating confirmation bias.

4. Updating beliefs with new information. There is a famous quote — falsely attributed to the renowned economist John Maynard Keynes — that founders ought to adopt: "When the facts change, I change my mind." While this insight is straightforward, it's rare for people to update their beliefs when they are presented with new information; the pull of confirmation bias is strong. In the IARPA tournament, superforecasters updated their predictions much more frequently than anyone else and also did so in smaller increments. The lesson here is to be conscious of new information and ask yourself what it should shift in your existing beliefs and by how much. Thomas Jefferson summed up this idea: "He who knows best knows how little he knows."
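One way to make 'updating in small increments' mechanical is Bayes' rule in odds form: multiply your prior odds by a likelihood ratio for each new piece of evidence. A minimal sketch, with illustrative numbers that are assumptions rather than data:

```python
def update(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Hypothetical: you believe a new feature has a 30% chance of improving retention.
belief = 0.30
belief = update(belief, 1.4)  # a modestly positive cohort report: nudge up
belief = update(belief, 0.8)  # one churned power user: nudge down
print(f"Updated belief: {belief:.0%}")  # ~32%: shifted, not flipped
```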

5. Correcting for time scope. A study asked ordinary people to predict the likelihood of an event occurring over different time horizons (e.g., 1, 5, or 10 years). One of the most interesting findings was that their answers were almost identical regardless of the time horizon. Faced with the same question, superforecasters accounted for time scope, adjusting their estimates so that probabilities rose the longer the time horizon. This doesn't tie explicitly to availability, representative, or confirmation bias, but it reiterates the lesson that the more often we activate our System 2 thinking when working through our startup challenges, the less likely we are to succumb to subjective, bias-driven judgement errors.
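Why should the probability rise with the horizon? If an event has some chance of occurring each year, the chance that it happens at least once compounds over time. A quick sketch with an assumed annual probability:

```python
# Assumption for illustration: a 5% chance the event occurs in any given year,
# independently across years.
annual_p = 0.05

for years in (1, 5, 10):
    # P(at least once in n years) = 1 - P(never) = 1 - (1 - p)^n
    p_at_least_once = 1 - (1 - annual_p) ** years
    print(f"{years:>2} years: {p_at_least_once:.0%}")  # 5%, 23%, 40%
```

Same question, very different answers depending on time scope.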

Insights & Intuition

If building a company (or a life) were only about error reduction, that wouldn't be much fun, nor very productive. So we've arrived at the fun part: uncovering insights. The third book that inspired this article is Seeing What Others Don't: The Remarkable Ways We Gain Insights by psychologist Gary Klein, and it focuses on the very intriguing puzzle of what drives useful intuition and the discovery of insights. In the book, Klein uses a metaphorical graph with two arrows to describe what makes people and organizations successful: the 'down arrow' focuses on the elimination of errors, while the 'up arrow' focuses on the creative processes that deliver insights. So how do insights work? From the 100+ examples he studied, Klein developed the "Triple Path Model" of insight, which maps the different routes by which insights arrive: making new connections, spotting contradictions, and 'creative desperation' under pressure.

Environments that generate fast feedback and constant access to updated information are very friendly to achieving insight. You can download a paper co-written by Daniel Kahneman and Gary Klein on this topic if you’d like to read more about the conditions that help foster insight.

Unfortunately, it doesn't seem possible to reverse engineer the specific sequence of actions that leads to an insight. But that doesn't mean you can't foster an insight-friendly environment: one that is open-minded toward alternative perspectives, encourages the challenging of assumptions, and balances the reduction of errors with risk taking. I've found that simply being aware of this theory of insight has made me more conscious of how I can create a culture of creativity and not succumb to an extreme 'error reduction' view of the world.

You'll notice the section on insight here isn't nearly as long as the section on biases and heuristics, and to some degree that speaks to the magic of the human mind. Just because this section is shorter doesn't make it less important, and I recommend reading some additional examples from the Triple Path Model of insights because they are both fun and fascinating.

Tying It All Together

Finding the right balance between eliminating bias and achieving meaningful insight can be the difference between an early stage startup finding product-market fit or not, between a large company discovering an unexpected new area for growth or missing it, and, more generally, between companies that continue to thrive over long time horizons (e.g., Amazon) and those that don't (e.g., Kodak).

Balancing these two forces takes two forms: 1) eliminating your own biases by leveraging the 'superforecaster' toolkit or other methods of conscious awareness, without becoming so fixated on error reduction that you cut yourself off from insight and original thought, and 2) instilling these principles in a wider team.

For all of the books on leadership, management, business process, and the rest, I would argue that the true recipe for building a great organization lies simply in finding the right harmony between error and insight. All of those other concepts are just strategies for achieving this goal.

And now for one final observation. Even though the elimination of error and flashes of insight may appear to pull us in opposite directions, Gary Klein's work on insight and Tetlock's study of superforecasters (and their use of System 2 thinking) suggest that the conditions for achieving both are more similar than one might think: challenge assumptions, welcome alternative perspectives, and don't shy away from observations that don't match conventional wisdom.

Next time you think about your business, I hope you’ll take a moment to look at it through the lens of bias and insight.

All things startup and technology. Founder of https://www.scrumlaunch.com — a product development studio for high growth startups and leading brands.
