Risk does not lie in the probability of an event but in its optionality. Risk should be evaluated not on rationality but on optionality, meaning a heuristic calculation of gains and losses, which in a complex environment is always asymmetrical. The lessons Nassim Nicholas Taleb offers in his book Antifragile apply to corporate risk evaluation as much as to all other cases in which organizations, not individuals, evaluate and prepare for deviations from the normal.
Air transport safety case
All kinds of governmental and corporate policies should consider the lessons of this simple case. Disproportionately large sums of money are invested in air transport safety, even though casualties and material damage caused by plane crashes are already negligible compared to car accidents, domestic violence, or various types of home accidents.
There are a couple of reasons for such disproportionality. The first one is obvious. The causes of the latter accidents are easily attributed to individuals, while air transport accidents are attributed to organizations. As Taleb says: never travel by aeroplane without a pilot. And further: never travel in a plane without a copilot. Corporate redundancy matters when no single individual is in charge.
But there is also a second, more important reason behind such behaviour. People decide not on the basis of truth but on the basis of gains. Many small gains (faster, more comfortable, cheaper air travel) can be eradicated by one significant loss: death. The truth that air transport is much safer than any other means of travel counts for nothing, since we decide by calculating gains. For that reason, investment in air travel safety should be disproportionately large compared to the rational benefits gained from such investment.
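As a rough sketch of this asymmetry (all numbers are hypothetical, chosen only for illustration), the naive expected value of a flight can stay positive while the single ruinous outcome still dominates the gain calculation over a lifetime of exposure:

```python
SMALL_GAIN = 1.0           # convenience value of one safe flight (arbitrary unit)
CRASH_LOSS = -1_000_000.0  # subjective cost of death: effectively unbounded
P_CRASH = 1e-7             # illustrative crash probability per flight

def expected_gain_per_flight(p_crash: float) -> float:
    """Naive symmetric expected value of a single flight."""
    return (1 - p_crash) * SMALL_GAIN + p_crash * CRASH_LOSS

def ruin_probability(p_crash: float, flights: int) -> float:
    """Probability of at least one crash over many flights."""
    return 1 - (1 - p_crash) ** flights

# Per flight, the expected value looks comfortably positive...
print(expected_gain_per_flight(P_CRASH))
# ...but the probability of the one irreversible loss accumulates with
# exposure, which is why safety investment is pushed far beyond what a
# symmetric cost-benefit calculation would justify.
print(ruin_probability(P_CRASH, 10_000))
```

The point of the sketch is only that the two quantities move in opposite directions: the average gain stays flat while the chance of ruin grows with every repetition.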
Implications for fake news theory
Since fake denotes something that is not true, it is easy to see why we do not care whether something is fake. We care only whether we can gain from it. That explains why the majority, driven by collective intellectual nudges, rationally despise fake news yet behave in accordance with it whenever they judge that the fake benefits them. Fake news is an intellectual fallacy that makes media run by intellectuals as happy as a Thanksgiving turkey the day before its execution.
Why is air travel risk not comparable to nuclear?
But then, it is not so easy to deal with gains. For instance: why do all but a few accept travelling by plane while, on average, only half of the population accepts nuclear power? Both are comparable in casualties and material damage. Nuclear disasters have caused a negligible death toll compared to every other source of electricity production, for a reason similar to air transport safety: disproportionately large sums of money and expertise were put into nuclear safety, perhaps even more than into air travel safety.
Asymmetry of gains
However, there is one crucial difference between the risk evaluation of air transport and that of nuclear power plants. It lies in how the asymmetry is distributed in the two cases. While there is no comparable alternative to aeroplanes (clear benefits), ample options exist for electricity production. The comparative advantages of nuclear energy are replaceable by less competitive but, at the same time, less risky alternatives.
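The difference can be put as a one-line optionality calculation (utilities are hypothetical): the cost of forgoing a risky option is not its absolute benefit but the gap between it and the best remaining alternative.

```python
def cost_of_forgoing(option_value: float, alternatives: list[float]) -> float:
    """What is given up by rejecting a risky option: the gap to the
    best remaining alternative, or the full value if none exists."""
    return option_value - max(alternatives, default=0.0)

# Air travel: no comparable alternative, so rejecting it forfeits everything.
print(cost_of_forgoing(100.0, []))
# Nuclear power: slightly less competitive alternatives exist, so little is
# lost by choosing a less risky option, and the asymmetry flips.
print(cost_of_forgoing(100.0, [90.0, 85.0]))
```

With no alternative, the same level of risk is worth tolerating; with near substitutes available, even a small residual risk tips the heuristic calculation against the risky option.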
Please note that when risk evaluation is at stake, we cannot seriously consider some objective risk, only risk perception. Objective risk exists but is of no value to humans, since they do not evaluate risk rationally but on the basis of calculated benefits. Benefits cannot be anything but personal (subjective), so risk evaluation on a personal level cannot be anything but a heuristic, non-scientific, subjective evaluation of gains and risks.
What about global warming risk?
But then an interesting heuristic emerges if we compare the perception of nuclear disaster risk with that of global warming risk. Suppose the risk of global warming ranks high in the perception of gains and losses. In that case, people should embrace the nuclear solution, since the alleged global extinction of civilization due to the irreversibility of global warming presents a much higher risk and far more harmful losses than a still-localized possible nuclear accident.
My interpretation of why we do not evaluate global warming as a considerable risk goes in three directions:
A: We usually behave as if we do not care about the possible truth of warming and evaluate only our gains. In our perception, global warming is not a rare event (a black swan) but something that will or will not come by definition, not by chance (unlike other catastrophes). So we evaluate a nuclear disaster as a possible black swan, while global warming represents either something insubstantial or something inevitable, like death.
B: While intellectuals believe they can teach birds to fly, ordinary people with heuristic (evolutionary) knowledge know that such a notion is not so much false as stupid. But since they have gone through a rationalistic education, they have to express their (rational) beliefs in line with the majority belief. They support the fight against global warming in words (rationally) while acting as if global warming presented no danger.
C: Since we do not produce energy personally and since, in our experience, electricity is almost 100% available, we cannot feel involved in energy production and thus feel no risk to supply. Similarly, since we know that most of us cannot replace aeroplane pilots, we feel no risk once we have accepted the initial promise of safety backed by ample funding.
I guess that all three factors work simultaneously in the case of nuclear energy.
Corporate risk evaluations
And what lessons do these and many other clever insights from Taleb hold for corporate risk evaluation?
The main one is that risk should be understood not as objective or rational but as intersubjective. When a material breaks, we talk about failure; when people fail, we talk about crisis. The two types of risk should be evaluated and managed differently. The first is linear and, in most cases, reversible. The second is non-linear and, in most instances, self-propagating and irreversible.
To conclude: while wishful thinking never affects matter, it is (wishful) thinking and acting alone that could, in principle, prevent or mitigate intersubjective risks like financial bubbles or global warming. Two utterly different risk evaluators should deal with the two kinds. The most threatening corporate risks are found not in the physical but in the intersubjective domain, not in probability but in optionality.