This article is part of one of Wilco Koorn’s “42 Years of Programming” series of short stories. Wilco Koorn is a senior developer at Trifork retiring after a long career in the software industry. During this series, he discusses 42 lessons and revelations he’s had throughout his career as a programmer. This series alludes to “The Hitchhiker’s Guide to the Galaxy” –– a comic science fiction series by Douglas Adams –– whose first novel revealed the number 42 to be the “Answer to the Ultimate Question of Life, The Universe, and Everything”.

One of the biggest lessons I’ve learned in recent years is that cognitive biases exist and influence decisions made in software architecture. More importantly, they often lead to suboptimal decisions.

Here’s an example I’m sure you’re aware of: when a new team is formed –– or when the team gets a new member –– you experience the “In my previous project…” debate. Arguments consist of why technology X is to be preferred over technology Y based on people’s previous experiences with these technologies. Many forms of these debates exist, with people suggesting the use of Mercurial over, for example, Git, or Bamboo over Jenkins. The list of preferences is endless –– especially when you take the various versions of each technology into account.

There is a good reason for these discussions: people genuinely believe their preferred option is the better one. Few people, however, realise that these choices are influenced by cognitive biases. Past positive or negative experiences with a technology shape their preference for it, yet such preferences are not wholly based on facts.

My favourite biases are:

  • “Anchoring”: The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions.
  • “Confirmation bias”: The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.
  • “Law of the instrument”: An over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches.

I find these biases fascinating (and yes, I, too, have made a few suboptimal decisions as a result of them). Once you acknowledge their existence, you will start to see these cognitive biases everywhere.

So, what can you do? Start a discussion at the meta level: talking openly about the presence of cognitive biases will help the team return to the debate with better decision-making capabilities.