Hurricane Sandy raises, once again, the psychology of black swan events

How management should protect the organization from rare but severe events.

Why do so many businesses and individuals fail to purchase flood insurance (to take just one example of behavior that is inexplicable in terms of basic risk management)?

The 2012 superstorm that hit New York and New Jersey exposed a common problem with “black swans.” A black swan is an atypical event whose occurrence is unlikely and whose effect is far greater than that of the common events that occur all the time (1). Risk managers have always been concerned with severity versus frequency. The severe event, although rare, is the more important of the two. Yet flawed psychology and misaligned incentives cause non-risk managers to focus more on frequency. That is understandable, but it is a mistake.

Very frequent loss events are a cost of doing business. They are not even insurable on a basis that makes good business sense. This is the fallacy of dollar trading. An insurance company will be happy to accept your premium dollars as long as the premium is at least 165% of the average annual loss. The somewhat frequent losses are the ones that get the most attention from CEOs and CFOs. These are the losses that don’t happen every day, but they happen enough and cost enough to be a concern. These are the losses your insurance broker will make sure are covered, and these are the areas your loss control efforts will target.
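
To put rough numbers on the dollar-trading point, here is a minimal back-of-the-envelope sketch in Python. The 165% load comes from the paragraph above; the $100,000 loss figure is hypothetical and purely illustrative. If losses are frequent enough to be predictable, the premium is just those same losses with the insurer’s markup added on top.

# A back-of-the-envelope sketch of "dollar trading" (the $100,000 figure is hypothetical).
# Very frequent losses are predictable, so insuring them simply returns your own
# loss dollars to you with the insurer's load added on top.

expected_annual_loss = 100_000        # average annual cost of the small, frequent losses
premium_load = 1.65                   # premium set at 165% of the average annual loss

premium = expected_annual_loss * premium_load

print(f"Retain the frequent losses:  ~${expected_annual_loss:,.0f} per year")
print(f"Insure them instead:          ${premium:,.0f} per year")
print(f"Cost of trading dollars:      ${premium - expected_annual_loss:,.0f} per year")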

What about truly severe events? These are not managed unless there is a focused risk management culture. This is where many companies are exposed. This is the area where the events that bring companies to their knees take place. Company owners and managers have a way of putting aside worry about the rare but serious event, except for a vague trepidation about it in the back of their minds.

Why does this happen?

First, there are psychological phenomena that distort our judgment. We judge the likelihood of a future event by how easily we can recall one like it (the “availability bias”) or by how recently one happened (the “recency bias”). The “Bystander Apathy Effect” lets us stop worrying in a group if no one else in the group raises the concern. Another example is the “Induction Problem”: with inductive reasoning we project into the future based on events we have observed in the past, so if it hasn’t happened to us, we assume it won’t.

Then there are the sometimes perverse incentives at work. Your insurance broker’s incentives are geared toward ignoring severe risk. Brokers need to place policies and keep their customers satisfied. They can’t get bogged down in seemingly irrelevant conversations about events that almost never happen. They cannot be expected to criticize the terms and conditions of their own product, except with respect to the losses they know will occur in the short term, which they emphasize in their proposals. Finally, black swans happen so rarely that if one does occur and costs them a customer, it’s just one customer!

The insurance products described above are the epitome of the “don’t worry about it, it will never happen” syndrome. This is not a knock on brokers; it is the structure of the insurance market.

CFOs can also get caught up in short-term thinking because they are too busy, or because they plan to be with the company only a short time. For owners: make sure your incentives are arranged so that your CFO is attuned to severe risk as well as to the somewhat frequent risks. The most thoughtful CFOs, or those encouraged by their bosses, are just as busy but know they can and should outsource risk management.

Owners, the CFO needs to have the same thought process as you do regarding the long-term survival of the business.

Managing severity is not that difficult, but it requires a risk management culture. Severe events do not happen suddenly and without warning; it just seems that way because weak signals are not recognized and acted upon. There is a lot of apparent noise in the operations of any organization. Some of it is exactly that: pure noise. But some of it is not noise at all; it is faint signals that trouble is brewing. Being aware enough to tell the difference is the essence of serious risk management.

The first puff of smoke is a warning that something bad is about to happen; everyone knows that smoke precedes fire. Similarly, things are constantly going wrong in an organization, and many faint signals are present, like smoke. Busy executives push them aside until they are serious enough to worry about, and by then it is sometimes too late. “Mindfulness” is the word so-called “High Reliability Organizations” (HROs) (2) use to describe the ability to distinguish the weak signals that matter from those that do not.

Here is another phenomenon: safety standards have been instituted in companies around the world. These are the rules of OSHA and other government agencies, insurance companies, and loss control experts, and they almost always require redundancies and safety margins in every operation. Yet disasters happen anyway. Why? In practice, margins are not always fully respected; people cheat for the sake of speed and cost, and still, usually, nothing happens. If cheating on tolerances caused disaster every time, the cheating would stop. The few times disaster does strike, something else is at work.

Workers know they can cut a corner here and there; they know the margins are there, and they shave them without any ill effects. But sometimes, on the same job, another margin is shaved, and maybe a third. The defects are additive and/or multiplicative, and the cumulative effect is disastrous. For example, despite strong safety supervision, cranes continue to collapse. For purposes of discussion, assume three safety factors: a weight limit on the material being lifted, a level foundation, and low wind speed. Slightly exceeding the limit on any one of these can be tolerated, but all three at the same time will cause a crash.
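
To see why the cheating persists yet eventually ends in disaster, here is a minimal probability sketch in Python. The 5% violation rates, the annual lift count, and the assumption that the three violations are independent are all hypothetical, chosen only to illustrate how the margins compound.

# A sketch of compounding margin violations (all numbers are hypothetical).
# Assume each crane margin -- load limit, level foundation, wind speed -- is
# independently shaved on a small fraction of lifts.

p_overload = 0.05      # lifts slightly over the weight limit
p_unlevel = 0.05       # lifts on a not-quite-level foundation
p_windy = 0.05         # lifts in marginally high wind

p_all_three = p_overload * p_unlevel * p_windy   # independence assumed
lifts_per_year = 5_000

print(f"Share of lifts with all three margins shaved: {p_all_three:.4%}")
print(f"Expected dangerous lifts per year: {p_all_three * lifts_per_year:.2f}")
# Any single shortcut almost never bites, so the behavior continues --
# yet across thousands of lifts the combination eventually occurs.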

At least two HRO principles would apply to this situation to detect the confluence of risks, the combination of risk factors, that would otherwise go unnoticed. “Sensitivity to operations” means there is a risk management presence at ground level (the “operations level”), and “deference to expertise” makes the risk management viewpoint the dominant one in such a situation.

Frequency-versus-severity thinking should apply to buying insurance as well. Non-risk managers push severity to the back of their minds, and their insurance brokers are more than happy to go along. People take comfort in the fact that this or that type of event “hasn’t happened here in 20 (or 30, 40, 50 – insert your own number) years.” That kind of statement is faulty logic. Severe events don’t happen to a single person or company often enough to measure that way; our own experience base is simply too small to be credible. Only insurance companies, and depending on the severity only the largest insurers, have the critical mass to build credible models of how frequently severe events occur. For the individual company, thinking like this is nothing more than an excuse to ignore the problem (or a defense mechanism if the loss has already occurred).
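
As a minimal illustration of why a clean local record proves very little, here is a short Python sketch; the 2% annual probability is hypothetical and chosen purely for illustration.

# Why "it hasn't happened here in 20 years" is weak evidence (hypothetical numbers).

annual_probability = 0.02     # assume a severe event has a 2% chance in any given year
years_observed = 20

p_clean_record = (1 - annual_probability) ** years_observed
print(f"Chance of seeing no event in {years_observed} years anyway: {p_clean_record:.0%}")
# About 67%: most companies exposed to this risk would see nothing in 20 years,
# so a clean local record says little about whether the risk is worth managing.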

Understand the Black Swan problem, the psychology and incentives behind it, and how to handle it, and you will be in the top 20% of companies. Build a risk management culture and obtain risk management resources, either internally or through consultants.

(1) See The Black Swan by Nassim Taleb. You may also want to read Taleb’s earlier book, Fooled by Randomness.

(2) Organizations such as elite military units, nuclear power plants, and hospitals are called “High Reliability Organizations” because of the immense importance of risk management to them. HRO principles and practices can be classified as 1. Preoccupation with failure; 2. Reluctance to simplify; 3. Sensitivity to operations; 4. Deference to expertise; and 5. Commitment to resilience. See Managing the Unexpected by Weick and Sutcliffe.
