We live in a modern information age of big data, enormous computing power, massive quantities of statistics, facts and figures at our fingertips, and hundreds, if not thousands, of analysis tools to make sense of this material. You’ll find bar charts for this, pie charts for that, radar graphs, histograms, doughnut wheels, Venn diagrams, scatter charts, bowtie analysis and Monte Carlo simulations. There are almost as many ways to cut, slice and dice information as there are sources of it. It has become very easy to over-analyse a situation, so much so that the decision we should have taken is often never actually taken; we get bogged down in the details and ultimately paralyse the outcome. Whether you call this analysis paralysis, a paradox of choice, overthinking, or just plain indecisiveness, the impact is obvious: anxiety, stress, lowered levels of creativity and productivity, and a feeling of being exhausted and overwhelmed. And to top it all off, you still aren’t any closer to finding a solution to the original problem.
Over the years, risk management has become increasingly reliant on statistics, measurements, indicators, values, gauges, signs and markers. Risk managers are encouraged to analyse key risk indicators, examine contributing factors, measure and map consequences, evaluate historical volatility, and even to come up with quantifiable risk metrics to run through scenario planning models with a view to predicting future events. The computational and empirical methodologies involved in sound risk management have been well documented and discussed, but are we neglecting one important factor in an endless quest to dive as deep as possible into statistical certainty? What happened to good old common sense? Using statistics makes an assumption, and in my opinion a flawed one, that the future is a mirror of the past. A risk manager needs to have an unwavering belief in the impossible. Just because it hasn’t happened before doesn’t mean it won’t ever happen.
At a recent IRMSA (Institute of Risk Management South Africa) breakfast in Windhoek, Namibia, Sven Thieme, the Executive Chair of the Ohlthaver & List Group, Namibia’s largest privately held group of companies, gave an impassioned presentation on the seeming lack of common sense, not just in the field of risk management, but as an observation of life in general. One only needs to take a look at the instruction manuals for most everyday products to observe how common sense seems to have made an exit from many people’s realm of thinking: “don’t insert fingers in toaster”, “never use a lit match or open flame to check fuel level”, “baby carrier for use with infant only” or my favourite, “contains nuts”, on a tin of salted peanuts.
“Common sense is not so common”
Voltaire
Josh Billings (the pen name of the famous 19th century author Henry Wheeler Shaw) spoke about common sense as “the ability to see things as they are without prejudice and do things as they ought to be done without influence of any kind”. The Merriam-Webster Dictionary defines common sense as “sound and prudent judgement based on a simple perception of the situation or facts”. And this is precisely why over-analysis, as discussed above, tends to crunch common sense under its heavily weighted boot when performed in isolation. I know scientists and statisticians will have a real go at me for saying this, but there are times when we need to intentionally limit the amount of information we take in. And in order to do this, we have to have a very clear and concise understanding of what our main objective is. A well-defined knowledge of the end objective makes you far more shrewd and decisive, because you can immediately assess options that will help to realise that objective, and not waste time on information that just isn’t relevant. Sometimes all it takes is reaching out for someone else’s opinion. Discussing your objective and challenges with someone else often forces you to synthesise the information collected into a simple, succinct format, as well as offering validation or refutation of the solutions you’ve come up with. When this is the opinion of someone you respect, it can go a long way towards overcoming the doubt experienced when attempting to tackle the problem on your own.
Software developers sometimes refer to an iterative approach to challenges, which I believe can be incredibly useful in the field of risk management. Occasionally a company will release a product that is minimally viable, in other words an imperfect product, launched as quickly as possible, sometimes to a limited marketplace of customers. They will then focus on user feedback in order to test conjectures, identify what works and what doesn’t, and incorporate this feedback into future versions of the product. There are times when, no matter how much research you do, you’ll only find out how effective something is when you try it out in the field. You can then use this teething process to develop a product that fits far better with what the end user requires than if you had tried to create a solution you think is perfect for that end user. Acknowledging the existence of things we don’t know ensures our common sense decisions are still thoroughly questioned and tested.
I’m not saying that risk managers should abandon all scientific methods and stick to their gut feel when making decisions. There is a fairly big difference between common sense and intuition. Intuition is knowledge that comes without necessarily having the benefit of our five senses. It is a deeper wisdom and insight that seems to come from somewhere outside of our conscious brain, without our fully understanding why, whereas common sense is more of a logical deduction based on how we perceive the situation at hand. Both form an important part of decision making, but common sense and intuition become that much more powerful when backed by adequate facts and figures (without sliding into the analysis paralysis discussed above) and by asking yourself the right questions. As the definition states, common sense is based on a simple perception of the situation or facts, the key word being simple. Constantly asking the question “How do I know that?” is a great way to make sure you are using common sense.
“Believe nothing, no matter where you read it, or who said it, no matter if I have said it, unless it agrees with your own reason and your own common sense.”
Siddhartha Gautama Buddha
Author – Paul van der Struys
June 2017