The 9/11 Memorial Museum in New York City is a moving tribute to a tragic episode in the city's history. It's also incredibly extensive. I planned to spend two hours there, and ended up spending four. Part of that was my own interest in the event, having lived in New York for many years (albeit well after 9/11). But another reason for the length of my visit was the surprising nuance and richness of the exhibits. I was pleased to see that the museum didn't shy away from presenting uncomfortable issues that resulted from 9/11, such as a series of expensive wars of debatable efficacy, the rise of anti-Muslim sentiment globally and in the US, and threats to individual civil liberties. It was also another reminder of the concepts of reflexivity and the risk society. I refer here not to the concept of reflexivity popularized by George Soros, but rather to its use by the sociologists Ulrich Beck and Anthony Giddens. To simplify greatly, Beck and Giddens described our modern world as a risk society, in which (a) we devise methods of dealing with the hazards and insecurities created by modernization itself, and (b) respond in turn to the risks created by those supposed safeguards. The idea of a feedback loop has obvious parallels to the Soros concept, since both iterations are founded on ideas developed by William Thomas, Robert K. Merton and Karl Popper.
9/11 clearly demonstrates the risk society at work. Nineteen men were able to turn the fruits of modernization (airplanes and skyscrapers) into weapons of mass murder and terror. In response, the US and its allies launched a global "war on terror" that at the least inconveniences us as air travellers, and at its worst seems to have resulted in the torture of terrorism suspects, giving fuel to anti-Western sentiment.
This certainly poses problems for those attempting to be thoughtful citizens. While we may hope to be rational and well-informed about political issues, it is incredibly difficult to do even an informal mental cost-benefit analysis. Indeed, one of the best guides to public policy is still Bastiat's urging to consider "that which is seen, and that which is not seen". Take national security, for example. Its very nature prevents citizens from ever learning the full extent of threats or the cost of steps taken to deal with those threats. So how can citizens assess whether increased safety has merited the extraordinary costs (as the Cato Institute asks here)? These costs refer not only to the financial outlay on combating terrorism since 9/11, but also to the possibility that terrorism is better forestalled by alleviating poverty and disenfranchisement across the globe.
Climate change offers another example of the challenge of being a citizen in today's risk societies. It seems to me that reasonable people can argue as to the urgency of the climate crisis, and how much should be done to prevent further man-made climate change (for example, see Matt Ridley on being a "climate lukewarmer").
Nassim Taleb suggests the use of the "precautionary principle" in such matters. Essentially, he and others argue that if an action or policy has a suspected risk of causing harm to the public or to the environment, then in the absence of scientific consensus that the action or policy is not harmful, the burden of proof falls on those taking the action. I'm not quite sold on this. He raises the question of drugs such as Vioxx, which reportedly caused 88,000 heart attacks and 38,000 deaths in the US. While Vioxx is a tragic and horrific story, the "seen and unseen" dictum creates complexity here too: there are presumably many people who deteriorate or die while beneficial drugs are held up in the approval process.
Let me move from national security issues to security issuances (geddit?). Taleb is certainly better known for his writing on financial and economic risk, and I quoted him in an earlier post on risk. A recognition of black swan events leads to an emphasis on robustness (or even anti-fragility). Last week brought a reminder of the importance of robustness: the Swiss central bank decided to remove its de facto peg to the euro, causing heavy losses that forced several hedge funds to close. Retail FX investors similarly learned the pitfalls of 50:1 leverage when a currency moves 28% in a day.
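The arithmetic behind that last sentence is worth making explicit. A quick sketch (with hypothetical round numbers, not any actual trader's position) shows why 50:1 leverage plus a 28% adverse move is ruinous:

```python
# Sketch: loss on a leveraged FX position (hypothetical numbers).
# At 50:1 leverage, each unit of equity controls 50 units of exposure.
equity = 1_000                  # trader's own capital
leverage = 50                   # 50:1, as offered by many retail FX brokers
position = equity * leverage    # notional exposure: 50,000

adverse_move = 0.28             # a 28% move, roughly the CHF jump of Jan 2015
loss = position * adverse_move  # 14,000: fourteen times the trader's equity

print(f"Notional: {position}; loss: {loss:.0f}; "
      f"equity wiped out {loss / equity:.0f}x over")
```

The point is not the specific numbers but the asymmetry: leverage multiplies a survivable price move into a loss many times one's capital.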
I don't want to make this post longer than it already is, so let me end with some brief comments on how to increase robustness in one's financial portfolio:
(1) Recognize sources of leverage. Leverage introduces fragility into one's financial position. Besides explicit leverage (that is, the borrowing of money to finance an asset), there is implicit leverage in businesses with high fixed costs relative to variable costs (operating leverage).
(2) Be reasonable about one's knowledge. A long familiarity with a name or industry should decrease (though never eliminate) the unknown unknowns over time. Again, this lends itself to sticking within one's circle of competence.
(3) Think about asset correlation, but realize these correlations can change in a globalized, financialized world.
(4) Size positions accordingly, in line with points 1-3.
(5) Be committed to learning appropriately from mistakes. Learning not to repeat mistakes is an example of Taleb's anti-fragility, since the individual or organization grows stronger from small losses. But we can sometimes take away too much from mistakes, as Ed Catmull learned when dissecting Pixar's missteps.
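Points 1 through 4 can be read as a rough sizing heuristic: shrink a position for every source of fragility it carries. The sketch below is purely illustrative; the function, its penalty factors, and all the numbers are my own hypothetical construction, not a method the post (or Taleb) prescribes:

```python
# A toy position-sizing heuristic illustrating points 1-4 (all hypothetical).
# Each source of fragility shrinks the starting allocation.

def position_size(base_weight, debt_to_equity, familiarity, correlation):
    """Return a portfolio weight, shrunk by each source of fragility.

    base_weight    : starting allocation (e.g. 0.10 for 10%)
    debt_to_equity : explicit leverage in the business (0 = unlevered), point 1
    familiarity    : 0..1, self-assessed knowledge of the name, point 2
    correlation    : 0..1, correlation with existing holdings, point 3
    """
    leverage_penalty = 1 / (1 + debt_to_equity)   # more debt, smaller size
    correlation_penalty = 1 - correlation         # more overlap, smaller size
    return base_weight * leverage_penalty * familiarity * correlation_penalty

# An unlevered, well-understood, diversifying position keeps most of its weight:
print(position_size(0.10, 0.0, 0.9, 0.2))   # roughly 0.072
# A levered, less-familiar, highly correlated one is cut sharply:
print(position_size(0.10, 2.0, 0.5, 0.8))   # roughly 0.003
```

The exact functional form doesn't matter; what matters is that sizing responds to all three fragilities at once, rather than to expected return alone.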
None of this requires fancy quantitative methods, as I argued in my earlier post. It does require common sense and humility to recognize that the world is viciously unpredictable. There's no use lamenting this fact. But we can take steps to prevent ourselves - or at least our portfolios - from being buffeted by our world of risk.