The Psychology of Risk – Don’t Let it Pervert Your Insurance Choices
How to protect the organization from rare but severe occurrences
by Frank Licata, Licata Risk Advisors, Boston (bio)
With weather events swirling, and cyber hackers swarming, we need to keep a clear head and have a coherent risk strategy.
The 2017 Houston flooding highlighted a problem common with “Black Swans.” A Black Swan is an outlier event whose occurrence isn’t likely, and whose effect is much larger than the run-of-the-mill events that happen constantly (1). Risk managers have always been concerned with severity vs. frequency. The severe, though rare, event is the more important. Psychology and misaligned incentives, however, cause business managers to focus more on frequency. This is understandable, but it is a mistake.
Truly frequent loss-causing events are a cost of doing business. They are not even insurable on a basis that makes business sense; this is the dollar-trading fallacy. An insurance company will be glad to take your premium dollars as long as the premium is at least 165% of the average annual loss. Somewhat frequent losses are the ones that receive the most attention from CEOs and CFOs. These are the losses that don’t occur every day, but happen often enough, and cost enough, to be a concern. These are the losses your insurance broker will make sure are covered, and these are the areas where your loss control efforts will go.
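The dollar-trading fallacy can be illustrated with simple arithmetic. The 165% loading factor comes from the text above; the loss figures and the function name below are hypothetical, chosen only to make the comparison concrete:

```python
# A minimal sketch of the dollar-trading fallacy: insuring predictable,
# frequent losses means paying the insurer's loading on every dollar.
# Figures are illustrative; the 165% loading comes from the article.

def dollar_trading_cost(avg_annual_loss: float, loading: float = 1.65,
                        years: int = 10) -> dict:
    """Compare retaining frequent losses vs. insuring them at a loaded premium."""
    premium = avg_annual_loss * loading  # insurer's price for predictable losses
    return {
        "retained_cost": avg_annual_loss * years,  # cost of simply absorbing losses
        "insured_cost": premium * years,           # cost of trading dollars with the insurer
        "extra_paid": (premium - avg_annual_loss) * years,
    }

result = dollar_trading_cost(avg_annual_loss=100_000)
```

With a $100,000 average annual loss, the company pays roughly $650,000 extra over a decade for coverage of losses it could have budgeted for directly, which is why truly frequent losses are better retained than insured.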
What about the truly severe events? These go unmanaged unless there is a focused risk management culture. This is where many companies are exposed. This is the area where events happen that bring companies to their knees. Company owners and managers have a way of brushing aside concern for the rare but severe event, except for some vague unease about it in the back of their minds.
Why does this happen?
First, there are some psychological phenomena that cause us to use faulty judgment. We judge the probability of an event happening in the future by how readily we can bring it to mind (the “Availability Bias”), or by how recently it has happened (the “Hindsight Bias”). The “Bystander Apathy Effect” allows us to wave off a concern in a group if no one else in the group raises it. One other example is the “Problem of Induction.” With inductive reasoning we project into the future based on events we have observed in the past. If it hasn’t happened to us, we assume it won’t happen.
Next, there are sometimes perverse incentives in operation. Your insurance broker’s incentives are oriented toward ignoring the severe risk. Brokers need to move policies out the door, and they need to have happy customers. They can’t get bogged down in what seems like irrelevant talk about events that hardly ever happen. They cannot be expected to critique the terms and conditions of their own product, except with respect to losses they know are bound to happen in the short term, which they emphasize in their proposals. Finally, Black Swans happen so rarely that if one does happen and they lose a customer, it’s only one! For a good example see There’s a Scandal Brewing in Commercial Property Insurance (https://licatarisk.com/cms/theres-a-scandal-brewing-in-commercial-property-insurance/).
The insurance products described in that article are the epitome of the “don’t worry about it – it will never happen” syndrome. This is not a criticism of brokers; this is the structure of the insurance marketplace.
CFOs can get caught up in short-term thinking as well, because they are too busy, or because they may plan on being with the company for only a short time. For owners: make sure your incentives are arranged so that your CFO is attuned to the severe risk as well as the somewhat frequent risk. The more thoughtful CFOs, or the ones encouraged by their bosses, are just as busy, but they know they can outsource risk management, and should.
Owners: the CFO should have the same thought process as you do regarding the long-term survival of the firm.
Let’s look at a case study. Our firm has researched the 2010 BP oil well blowout disaster extensively, reviewing all the government investigative reports, the ones by industry groups, and BP’s own analysis. The disaster cost $60 billion and took 11 lives. Days before the event, BP received a safety award (for activities aboard that very same rig) from the Minerals Management Service, the agency in charge of oversight at the time. This was not just an odd quirk; this sort of thing happens constantly in business operations of all kinds. The focus is on the somewhat frequent events, while managers are oblivious to the weak signals of much bigger problems brewing beneath the surface. Frequency is easier to manage than severity because it is visible, and there is immediate feedback as to whether it is being managed. BP actually had abundant warning that the well was getting out of control, but the culture was focused so much on cost and speed, and so little on risk management, that the company was somehow able to ignore one sign after another. See The BP Gulf Oil Spill: a Risk Management Debacle (https://licatarisk.com/cms/the-bp-gulf-oil-spill-a-risk-management-debacle/) for the full risk management story of BP.
Managing severity isn’t that hard; it does take a risk management culture, though. Severe events don’t happen suddenly without warning. It just seems that way because low volume signals aren’t recognized and acted upon. There is lots of apparent noise in the operations of any organization. Some of it is just that – pure noise. But, some of it is not noise at all, but rather weak signals of trouble brewing. Being mindful enough to see the difference is the essence of managing severe risk.
The first faint whiff of smoke is a warning signal of something bad about to happen. We know that smoke precedes fire, and not many of us ignore it. Similarly, other things are constantly going wrong in an organization, and many weak signals like smoke are presenting themselves. Busy executives brush them aside until they are severe enough to worry about. Sometimes by then it is too late. Mindfulness is the word used by so-called “High Reliability Organizations” (HROs) (2) to describe the ability to distinguish the important from the unimportant weak signals.
Here’s another phenomenon: safety rules have been instituted in companies far and wide. These are the rules of OSHA, other government agencies, insurance companies and loss control experts. These rules almost always call for redundancies and safety margins in all operations. But disasters happen anyway. Why? In practice the margins are not always completely observed; there is cheating going on in the interest of speed and cost, but usually still nothing happens. If cheating on the tolerances caused a disaster every time, the cheating would stop. The few times disaster does happen, something else is at work.
Workers know they can hedge a bit: they know the margins are there, and they shave them with no ill effect. But sometimes on the same job another margin will get shaved, and maybe a third. The defects are additive and/or multiplicative, and the cumulative effect is a disaster. For example, despite heavy safety oversight, cranes continue to collapse. For discussion purposes, assume three safety factors: a weight capacity on the material being lifted, a level base, and a low wind speed. Slightly exceeding the limit on any one of these can be tolerated, but all three at the same time will cause the collapse.
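The cumulative-margin idea in the crane example can be sketched as a simple check. The limits and readings below are purely illustrative placeholders, not engineering data, and the function is a hypothetical construct, not an actual crane-safety model:

```python
# Hypothetical sketch of cumulative shaved margins, per the crane example.
# All thresholds and readings are illustrative, not real engineering limits.

def margins_shaved(load_ratio: float, tilt_deg: float, wind_mph: float,
                   load_limit: float = 1.0, tilt_limit: float = 2.0,
                   wind_limit: float = 20.0) -> int:
    """Count how many safety margins are being exceeded at the same time.
    Any one exceedance alone tends to be tolerated; the combination is
    what produces the collapse."""
    exceedances = [
        load_ratio > load_limit,   # lifting more than rated capacity
        tilt_deg > tilt_limit,     # base not level enough
        wind_mph > wind_limit,     # wind above the operating limit
    ]
    return sum(exceedances)

one_shaved = margins_shaved(1.05, 1.0, 15.0)    # a single shaved margin
all_shaved = margins_shaved(1.05, 2.5, 22.0)    # all three at once
```

The point of the sketch is that each reading, viewed alone, looks like routine noise; only a check across all factors at once, the kind of "sensitivity to operations" the HRO principles call for, reveals the dangerous confluence.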
Two HRO principles would apply to this situation to scope out the confluence of risks, the combination of risk factors, that otherwise goes unnoticed. Management “sensitivity to operations” would cause there to be a risk management presence at ground level (“operations level”); and “deference to expertise” would cause the risk management view to be the dominant view in such a situation.
Frequency vs. severity thinking should apply to the purchase of insurance as well. Non-risk-managers put severity way in the back of their minds, and their insurance brokers are more than happy to go along. People take comfort from the fact that this kind of event or that kind of event “hasn’t happened here in 20 (or 30, 40, 50; plug in your own number) years.” That kind of statement is faulty logic. Severe events don’t happen to any single person or company with that kind of frequency. Our own, singular experience base is far too small to have any credibility. Only insurance companies, and, depending on the severity, only the larger insurers, have the critical mass to create models that treat the “frequency of severity” as a workable program. For the individual company, thinking that way is nothing more than an excuse to ignore the problem (or a defense mechanism if the loss has already happened).
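Why a single company's experience base is too small can be shown with a quick simulation. The 1%-per-year event probability and the 30-year window are assumed numbers chosen for illustration, not figures from the article:

```python
import random

# Illustrative simulation (assumed parameters): an event with a 1%-per-year
# probability will usually be absent from any one company's 30-year history,
# which is why "it hasn't happened here in 30 years" carries no information.

random.seed(42)
P_EVENT = 0.01   # hypothetical annual probability of the severe event
YEARS = 30
TRIALS = 10_000  # number of simulated 30-year company histories

def observed_events(years: int, p: float) -> int:
    """Count how many times the severe event occurs in one company's history."""
    return sum(random.random() < p for _ in range(years))

zero_count = sum(observed_events(YEARS, P_EVENT) == 0 for _ in range(TRIALS))
share_with_zero = zero_count / TRIALS
# Analytically, (1 - 0.01)**30 is about 0.74: roughly three quarters of
# companies see nothing in 30 years, yet every one of them bears the risk.
```

Only by pooling thousands of such histories, as an insurer does, does the true frequency of the severe event become visible; any single company's clean record is exactly what the model predicts either way.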
Understand the problem of the Black Swan, the psychology and incentives behind it, and the way to manage it, and you’ll be in the top 20% of businesses. Have a risk management culture and obtain the risk management resources, either in-house or on a consulting basis.
(1) See The Black Swan by Nassim Taleb.
(2) Organizations like elite military units, nuclear power plants and hospitals are called “High Reliability Organizations” because of the immense importance of risk management to them. HRO principles and procedures can be categorized as 1. Preoccupation with failure; 2. Reluctance to simplify; 3. Sensitivity to operations; 4. Deference to expertise; and 5. Commitment to resilience.
See Managing the Unexpected by Weick and Sutcliffe.
© 2017 Licata Risk & Insurance Advisors, Inc.
Sep 06, 2017