Risk does not lie in the probability of an event, but in optionality. Risk should not be evaluated on rationality but on optionality, that is, on a heuristic calculation of gains and losses, which in complex environments are always asymmetrical. The lessons Nassim Nicholas Taleb offers in his book Antifragile should be considered for corporate risk evaluation as much as for all other cases in which organizations, rather than individuals, evaluate and prepare for deviations from the normal.
Air transport safety case
All kinds of governmental and corporate policies should take into account the lesson from this simple case. Disproportionately large sums of money are invested in air transport safety, despite the fact that casualties and material damages caused by plane crashes are already negligible in comparison to car accidents, domestic violence, or various types of home accidents.
There are a couple of reasons for such disproportionality. The first is obvious: the causes of the latter types of accidents are easily attributed to individuals, while air transport accidents are attributed to organizations. As Taleb says: never travel on a plane without a pilot. And further: never travel on a plane without a copilot. Corporate redundancy matters when no individual is in charge.
But there is also a second, more important reason behind such behavior. People decide not on the basis of truth, but on the basis of gains. Many small gains (faster, more comfortable, cheaper air travel) are annihilated by one large loss: death. The truth that air transport is much safer than any other mode of travel means nothing to us, since we decide by calculating gains. For that reason, investment in air travel safety should be disproportionately large in comparison to the rational benefits gained from such investment.
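The asymmetry between many small gains and one large, irreversible loss can be made concrete with a toy calculation. This is only an illustrative sketch; the numbers and the ruin threshold are hypothetical, chosen to show why a positive average is not enough when one outcome is ruin:

```python
def expected_value(outcomes):
    """Naive rational evaluation: sum of probability-weighted gains and losses."""
    return sum(p * gain for p, gain in outcomes)

def acceptable(outcomes, ruin_threshold=-1_000_000):
    """Heuristic evaluation in the spirit of the text: reject any option
    that exposes us to ruin, no matter how favourable the average looks."""
    if any(gain <= ruin_threshold for _, gain in outcomes):
        return False
    return expected_value(outcomes) > 0

# Many small gains (cheap, fast, comfortable travel) against a tiny
# chance of one total, irreversible loss (death, modelled as ruin).
flight = [(0.999999, 100),
          (0.000001, -1_000_000)]

print(expected_value(flight))  # positive on average: ~98.9999
print(acceptable(flight))      # False: the asymmetry dominates
```

The point of the sketch is that both evaluators see the same facts; they disagree only because one weighs averages and the other weighs exposure to an unrecoverable loss.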
Implications for fake news theory
Since "fake" denotes something that is not true, it is easy to see why we do not care whether something is fake; we care whether we can gain from something. That explains why the majority, under collective intellectual nudges, rationally despise fake news, yet at the same time act in accordance with the fake if they evaluate that it benefits them. Fake news is an intellectual fallacy that makes media run by intellectuals as happy as the Thanksgiving turkey is the day before its execution.
Why is air travel risk not comparable to nuclear risk?
But then, gains are not so easy to deal with. For instance: why do all but a few accept traveling by plane, while still only a minority accepts nuclear electricity? Both are comparable in casualties and material damage. Nuclear disasters have caused a negligible death toll in comparison to every other source of electricity production, for a reason comparable to the air transport case: disproportionately large sums of money and expertise have been put into nuclear safety, perhaps even more than into air travel safety.
Asymmetry of gains
There is, though, one important difference between the risk evaluation of air transport and that of nuclear power plants. It lies in how the asymmetry is distributed in the two cases. While there is so far no comparable alternative to airplanes (clear benefits), there exist ample alternatives for electricity production. The comparative advantages of nuclear energy can be replaced by less competitive but at the same time less risky alternatives.
Please note that when risk evaluation is at stake, we cannot seriously consider some kind of objective risk, but only the perception of risk. Objective risk exists, but it is of no value for humans, since they do not evaluate risk rationally but on the basis of calculated benefits. Benefits cannot but be personal (subjective), so risk evaluation on the personal level cannot but be a heuristic, non-scientific, subjective evaluation of gains and risks.
What about global warming risk?
But then an interesting heuristic emerges if we compare the perception of nuclear disaster risk with that of global warming risk. If the perceived risk of global warming were really high in terms of gains and losses, people should clearly go for the nuclear solution, since the alleged global extinction of civilizations due to the irreversibility of global warming presents a much higher risk and far more harmful losses than a possible, still localized nuclear accident.
My interpretation of why we do not evaluate global warming as a huge risk goes in three directions:
A: We behave normally: we do not care about the possible truth of warming but evaluate only our gains. In our perception, global warming is not a rare event (a black swan) but something that will or will not come by definition, not by chance (unlike other catastrophes). So we evaluate a nuclear disaster as a possible black swan, while global warming represents either something insubstantial or something like death: inevitable.
B: While intellectuals really believe that they can teach birds to fly, normal people with heuristic (evolutionary) knowledge know that such a notion is not so much false as stupid. But since they have gone through a rationalistic education, they have to express their (rational) beliefs according to the majority belief. They support the fight against global warming in words (rationally), while they act as if global warming presented no danger.
C: Since we do not produce energy personally, and since in our experience electricity is almost 100% available, we cannot feel involved in energy production. We thus feel no supply risk. Likewise, since we know that most of us cannot replace an airplane pilot, we feel no risk once we accept the initial promise of safety backed by ample funding.
My guess is that all three factors work simultaneously in the case of nuclear energy.
Corporate risk evaluations
And what are the lessons for corporate risk evaluation if we take into account these and many other clever insights from Taleb?
The main one is that risk should not be understood as something objective or rational, but as something intersubjective. When material breaks, we talk about failure; when people break, we talk about crisis. The two types of risk should be evaluated differently and managed differently. The first is linear and in most cases reversible; the second is non-linear and in most cases self-propagating and irreversible.
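The contrast between the two failure modes can be sketched numerically. This is a hypothetical illustration, not a model from the text: material failure is taken to accumulate linearly with stress, while an intersubjective crisis (panic feeding panic, as in a bank run) compounds on itself:

```python
def material_failure(stress_per_step, steps):
    """Linear damage: each step adds the same amount, so removing the
    same amount reverses it. Total damage is proportional to exposure."""
    damage = 0.0
    for _ in range(steps):
        damage += stress_per_step
    return damage

def intersubjective_crisis(initial_loss, contagion, steps):
    """Self-propagating damage: each step multiplies what is already
    there, so the loss compounds and a same-sized corrective step
    cannot undo it."""
    damage = initial_loss
    for _ in range(steps):
        damage *= (1 + contagion)
    return damage

print(material_failure(1.0, 10))             # 10.0 -- proportional growth
print(intersubjective_crisis(1.0, 0.5, 10))  # ~57.7 -- compounding growth
```

After ten steps the linear failure has grown tenfold, while the compounding crisis has grown nearly sixtyfold from the same unit start, which is why the two call for different evaluators and different management.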
To conclude: while wishful thinking never affects matter, it is only (wishful) thinking and acting that could in principle prevent or mitigate intersubjective risks like financial bubbles or global warming. Two completely different risk evaluators should deal with the one and the other. The majority of corporate risks, or at least the most threatening ones, are to be found not in the physical but in the intersubjective domain; not in probability, but in optionality.