
Five cognitive biases to recognise and resist

The Business Continuity Institute's Middle East Conference will be taking place in Doha, Qatar on 11 and 12 May, and on day two of the conference, Howard Mannella MBCI, Managing Principal at Alternative Resiliency Services Corp, will be speaking on the subject of unforeseeable risk.

Johannesburg, 13 Apr 2015

Hey, we're all professionals, right? We are the experts on risk, right? We study, we read books... we even read blogs and fly to conferences, right? But! We oftentimes get it wrong - we miss risks and threats that we should have seen coming, thereby losing our stakeholders' confidence; or, we focus our time, our resources, and our 'political capital' in our organisations on risks and threats that didn't deserve our attention and our stakeholders' faith in us. Why?

Of course, that's partly the nature of our business. Risks and threats are not guaranteed. Probability will always play a part. Otherwise our investments would always pay off and we'd all be Wall Street millionaires.

However, we as humans come psychologically hard-wired (each to different degrees) with certain cognitive biases that influence our understanding of probability and risk. The first step is to recognise these biases; the second step is to build frameworks and controls into our programmes to correct for them. This can make our risk management and business continuity programmes more accurate, more targeted and better able to meet our organisations' needs.

Here are several cognitive biases to recognise and resist:

The anchoring fallacy

This is the tendency to stay focused, or 'anchored', on a previously-understood risk to the exclusion of its current state or evolving threats. Current airport security illustrates this misperception. Because of 9/11, box-cutters and other small sharp instruments will be forever banned for all passengers. Theoretically, compensating controls such as hardened cockpit doors, armed pilots, flight marshals, do-not-open and do-not-negotiate protocols (plus an aware passenger population that will fight back) have driven the success probability of a small-knife hijack attack to zero. However, security is still 'anchored' on pocketknives and box-cutters.

Why is this relevant to business resiliency? Many companies base their programme on: "Remember that incident in 19XX? That's why we invest in business resiliency!" Good to have management that sees the need, but it's up to us as risk professionals to keep the programme current by ensuring that we contemplate the evolution of risks and factor in emerging threats and global trends. The only guarantee in our business (besides death and taxes) is that the next event will be different from the last event.

Normalcy bias

This is the tendency to discount the probability of events with which the person making the judgment has no experience or familiarity - they are not 'normal' to them. I had a discussion with an executive of a Seattle-based company, who said: "Why should we contemplate earthquakes in our risk management? It's not like we're based in California!" Seismically speaking, the Pacific Northwest is one of the highest seismic threat regions in North America, if not the world (the Pacific Ring of Fire and all that). However, quakes were not in this exec's experience, so they were not 'normal' to him/her. Ask a native of Florida and a new arrival about the threat of hurricanes and you will get different answers.

Zero-risk bias

This is the tendency to prefer total elimination of a risk to a higher overall reduction across a threat surface. Given choices for actions or controls to reduce risk, people will more likely select an action or control that results in elimination of a five-point risk over one that results in a one-point reduction across six risks.
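The arithmetic behind that preference can be made explicit. Here is a minimal sketch using a hypothetical point-based risk register (the risk names and scores are invented for illustration; only the five-point and six-point figures come from the example above):

```python
# Hypothetical risk register: name -> risk score in points.
risks = {"A": 5, "B": 4, "C": 3, "D": 3, "E": 2, "F": 2, "G": 1}

def total(register):
    """Sum the scores across the whole threat surface."""
    return sum(register.values())

# Option 1: eliminate risk "A" entirely (the five-point risk goes to zero).
option_1 = dict(risks, A=0)

# Option 2: shave one point off each of six other risks (B..G).
option_2 = dict(risks)
for name in ["B", "C", "D", "E", "F", "G"]:
    option_2[name] -= 1

print(total(risks) - total(option_1))  # 5 points of risk removed
print(total(risks) - total(option_2))  # 6 points of risk removed
```

Option 2 reduces more total risk, yet zero-risk bias pulls people towards Option 1 because it makes one risk disappear completely.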

Availability bias

This one's almost the converse of normalcy bias: the tendency to overestimate risks of vivid or publicly-prominent occurrences or occurrences that are psychologically 'available'. People are acutely aware of high-profile tragedies such as child abductions or airline crashes, and therefore are wary of them. In reality, they are (thankfully) rare and lower-risk than perceived.

The odds of a flight ending in a crash are about 10 000 000:1, and 95% of those involved in crashes have survived - if one boards a plane, one will almost certainly walk away from the flight. Yet, how many companies still have policies prohibiting multiple executives from flying together? How many companies have no policy around executives sharing a car, where the probability of an accident or casualty is far greater?
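Combining the two figures above gives a sense of how small the per-flight fatality risk actually is. This is a back-of-the-envelope sketch using only the article's numbers, not official aviation statistics:

```python
# Figures taken from the text above (illustrative, not authoritative).
crash_odds = 1 / 10_000_000   # chance a given flight ends in a crash
survival_rate = 0.95          # share of those involved who survive

# Probability of boarding a flight and not surviving it.
p_fatality = crash_odds * (1 - survival_rate)

print(f"{1 / p_fatality:,.0f} to 1 against")  # 200,000,000 to 1 against
```

On these numbers, the availability of vivid crash imagery inflates a roughly 200-million-to-one risk into a matter of corporate policy.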

The Texas sharpshooter fallacy

This one's one of my favourites, and very relevant to business resiliency. A Texan shoots at the side of a barn, runs up and paints a bull's-eye around the bullet hole, and exclaims: "Yee-ha, I'm a sharpshooter!" This is the fallacy of starting with an outcome and working backwards to construct a hypothesis that fits it.

Why is this relevant to how we manage risk? Many executives point to an event's outcome and use it to justify whether their investment (or non-investment) in business resiliency is warranted. I had a discussion with an executive whose European headquarters experienced a two-alarm fire (smoke condition). The building had to be evacuated for about an hour. Thankfully, they were able to return. His assessment was: "Yes, we evacuated to our meeting point and waited for an hour, but we were able to return so it was not really a business resiliency event."

My assessment back to him was: "Okay, so this company was forced out of its European headquarters, lost all vital records, lost all technology assets, work came to a halt and hundreds of people's productivity was reduced to zero, and at 45 minutes into the event you did not know when you were returning, you did not know if you were returning, and had no contingency plan for non-return?"

His rejoinder? "We could see the building from the meeting point, it was still there!"

Business resiliency is not about the last non-material risk, it's about the next, perhaps-material risk.

There are many cognitive biases, such as zero-choice bias, base-rate bias, confirmation bias, choice-supportive bias and others. Some are highly relevant to risk management and business continuity. All are proof that there are factors impeding our understanding of risk and probability, and therefore of resiliency. The takeaway is that prudent next-generation business resiliency understands the existence of these biases and takes actionable steps to minimise their effect and work around them.

How can we make our programmes immune to these factors? See you at the conference!

To learn more about what Howard Mannella has to say about cognitive bias, come along to the BCI Middle East Conference. There is a packed programme of activities throughout the two days of the conference, so to find out more, or to book your place, click here.


Editorial contacts

Andrew Scott
The Business Continuity Institute
+44 (0)118 947 8241
andrew.scott@thebci.org