The Index Investor Blog

Wuhan Coronavirus - Critical Uncertainties

Since the end of December, we have seen an expanding outbreak of a novel coronavirus in China, spreading from its epicenter in Wuhan.

There are two critical uncertainties to resolve with more evidence: (1) the transmissibility of the Wuhan strain, which so far appears to be high, and (2) the pathogenicity (CFR), which at this point still appears to be relatively low. And when you hear an estimated CFR, always remember to check the denominator on which it is based (lab confirmed or just symptomatic cases).

When it comes to contagious viral diseases, there is usually a tradeoff between their transmissibility (how easily they spread) and their pathogenicity (how many people who become infected die). Viruses that quickly kill their infected hosts effectively limit their own spread.

The proportion of infected people who die is measured by the "Case Fatality Rate" (CFR). However, this is a noisy estimate, because the denominator can be based on lab-confirmed cases (which raises the estimated CFR) or on all symptomatic cases (which lowers it). Early estimates (based on very noisy reporting) have put the preliminary CFR for the Wuhan strain at around 2%. However, this will likely change as more evidence becomes available.
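To make the denominator effect concrete, here is a minimal sketch in Python using purely hypothetical numbers (not actual Wuhan data): the same death count divided by a narrower, lab-confirmed denominator versus a broader, all-symptomatic denominator produces very different CFR estimates.

```python
# Illustrative only: hypothetical numbers, not actual outbreak data.
deaths = 100                  # reported deaths to date
lab_confirmed_cases = 4_000   # cases confirmed by laboratory testing
symptomatic_cases = 20_000    # estimated total symptomatic cases (many never tested)

cfr_lab = deaths / lab_confirmed_cases        # narrower denominator -> higher CFR
cfr_symptomatic = deaths / symptomatic_cases  # broader denominator -> lower CFR

print(f"CFR (lab-confirmed denominator): {cfr_lab:.1%}")            # 2.5%
print(f"CFR (all symptomatic denominator): {cfr_symptomatic:.1%}")  # 0.5%
```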

To put Wuhan in perspective, the CFRs for Ebola and highly pathogenic H5N1 influenza are >60%. The 1918 pandemic flu was estimated at 10% to 20% (this strain was also relatively transmissible which is why it killed so many). The 2009 H1N1 "swine" flu CFR was estimated at 5% to 9%. By comparison, typical seasonal influenza has a CFR of one tenth of one percent or less (0.1%).

For other coronaviruses, SARS' CFR was estimated to be around 10%, while MERS' was 35%.

Transmissibility is measured using the “Basic Reproduction Number” (known as “R0” or “R-naught”), which is the average number of people who will become infected through contact with one contagious person. If R is less than one (e.g., because a high CFR kills hosts before they can spread the virus), an epidemic will quickly “burn itself out”. In contrast, when R is greater than one, a virus will spread exponentially.
Initial estimates of R for the Wuhan Novel Coronavirus are very noisy at this point. The World Health Organization has published a range of 1.4 to 2.5.
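To illustrate why the gap between R below one and R above one matters so much, here is a rough branching-process sketch (ours, not the WHO's) that compounds new infections generation by generation. It deliberately ignores susceptible depletion, interventions, and reporting lags, so the numbers are purely illustrative.

```python
# Rough branching-process sketch: each case infects R0 new people, on average, per generation.
# Ignores susceptible depletion, interventions, and reporting lags - illustrative only.
def cumulative_infections(r0, generations, seed_cases=1):
    total, current = seed_cases, seed_cases
    for _ in range(generations):
        current *= r0          # new infections in this generation
        total += current       # running total of everyone ever infected
    return total

# A sub-critical value for contrast, plus the low and high ends of the WHO range.
for r0 in (0.9, 1.4, 2.5):
    print(f"R0 = {r0}: ~{cumulative_infections(r0, 10):,.0f} cumulative infections after 10 generations")
```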

For comparison, here are some historic estimated Basic Reproduction Numbers:
  • 1918 Spanish Flu = 2.3 to 3.4 (95% confidence interval)
  • SARS Coronavirus = 1.9
  • 1968 Flu = 1.80
  • 2009 Swine Flu = 1.46
  • Seasonal Influenza = 1.28
  • MERS Coronavirus = <1.0
  • Highly Pathogenic H5N1 Influenza = 0.90
  • Ebola = 0.70

The most concerning reports about the Wuhan Coronavirus are claims that it may be capable of infecting other people before a patient becomes symptomatic (i.e., shows signs that he/she has contracted the virus).

An article in the Lancet (“Nowcasting and Forecasting the Potential Domestic and International Spread of the 2019-nCoV Outbreak Originating in Wuhan, China”) found that, “Independent self-sustaining outbreaks in major cities globally could become inevitable because of substantial exportation of presymptomatic cases & the absence of large-scale public health interventions."

If it is supported by subsequent research, this initial finding will almost certainly lead to the imposition of more travel bans and quarantine measures in an attempt to limit transmission of the virus.

To end this post with a bit of good news, a very recent analysis has concluded that, because the number of new coronavirus cases in China is growing more slowly than the exponential rate implied by its Basic Reproduction Number, quarantine, travel bans, and "self-isolation" measures appear to be having a positive impact (https://arxiv.org/pdf/2002.07572.pdf).
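For readers curious what "slower than exponential" looks like in practice, here is a small, purely illustrative sketch (synthetic parameters, not the paper's data or method) comparing pure exponential growth with a generalized growth model, dC/dt = r·C^p, where an exponent p below one produces the sub-exponential growth pattern such analyses look for.

```python
# Illustrative sketch with synthetic parameters (not real case counts): compare pure
# exponential growth (p = 1) with sub-exponential growth (p < 1) of cumulative cases C,
# using the generalized growth model dC/dt = r * C**p, stepped one day at a time.
def generalized_growth(c0, r, p, days):
    cases = float(c0)
    for _ in range(days):
        cases += r * cases ** p   # simple Euler step with a 1-day increment
    return cases

days = 20
exponential = generalized_growth(c0=100, r=0.3, p=1.0, days=days)
sub_exponential = generalized_growth(c0=100, r=0.3, p=0.8, days=days)

print(f"Cumulative cases after {days} days, exponential model:     {exponential:,.0f}")
print(f"Cumulative cases after {days} days, sub-exponential model: {sub_exponential:,.0f}")
```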

The Critical Importance of Anticipatory Intelligence in Our Complex, Uncertain World

The deceptive economic and geopolitical calm of the past decade has been an aberration, brought about by unprecedented global monetary stimulus to hold at bay the deflationary forces that have been building in the global economy. Thanks to central bankers’ efforts, volatility has remained low, and organizations have not had to worry too much about disruptive risks beyond those posed by rapid technological change. That is about to change: Brexit, the election of Donald Trump, the emergence of a new US-China Cold War, and nearly two trillion dollars of sovereign bonds bearing negative interest rates are early indications that we are entering a period of much higher uncertainty.

With this change will come much greater organizational focus on developing the processes, methods, tools, and skills needed to survive and thrive in a much more dangerous environment. Josh Kerbel, a faculty member at the United States’ National Intelligence University, recently published an article that we hope will have a substantial impact on these efforts, and closely reflects our views at Britten Coyne Partners.

In “Coming to Terms with Anticipatory Intelligence”, Kerbel notes that it is “a relatively new type of intelligence that is distinct from the “strategic intelligence” that the intelligence community has traditionally focused on. It was born from recognition that the spiking global complexity (interconnectivity and interdependence, both virtual and physical) that characterizes the post–Cold War security environment, with its proclivity to generate emergent (non-additive or nonlinear) phenomena, is essentially new. And as such, it demands new approaches.”

“More precisely, this new strategic environment means that it is no longer enough for the intelligence community to just do traditional strategic intelligence: locking onto, drilling down on, and — less frequently — forecasting the future of issues once they’ve emerged. While still important, such an approach will increasingly be too late. Rather, the intelligence community should also learn to practice foresight (which is not the same as forecasting) and imagine or envision possibilities before they emerge. In other words, it should learn to anticipate.”

Kerbel echoes longstanding concerns among some members of the intelligence community. For example, a 1983 CIA analysis of failed intelligence estimates noted that, “each involved historical discontinuity, and, in the early stages...unlikely outcomes. The basic problem was...situations in which trend continuity and precedent were of marginal, if not counterproductive value."

This distinction was also brought home to me during the four years I spent on the Good Judgment Project, which demonstrated that forecasting skills could be significantly improved through the use of a mix of techniques. But hiding in the background was an equally important question: What was the source of the questions whose outcomes we were forecasting? One of my key takeaways was that anticipatory thinking – posing the right questions – was just as important to successful policy and action as accurately forecasting their outcomes.

Kerbel notes that, “as clear and compelling as the case for anticipatory intelligence is, it remains poorly understood… Since the 1990s, increasing complexity has been an issue that many in the intelligence community have impulsively dismissed or discounted. Their refrain echoes: “But the world has always been complex.” That’s true. However, what they fail to understand is that the closed and discrete character of the Soviet Union and the bipolar nature of the Cold War — the intelligence community’s formative experience — eclipsed much of the world’s complexity and effectively rendered America’s strategic challenge merely complicated (no, they’re not the same). Consequently, the intelligence community’s prevailing habits, processes, mindsets, etc. — as exemplified in the traditional practice of strategic intelligence — are simply incompatible with the challenges posed by the exponentially more complex post-Cold War strategic environment.”

Kerbel’s view is that “Fundamentally, anticipatory intelligence is about the anticipation of emergence… Truly emergent issues are fundamentally new — nonlinear — behaviors that result unpredictably but not unforeseeably from micro-behaviors in highly complex (interconnected and interdependent) systems, such as the post–Cold War strategic environment. Although emergence can seemingly happen quite quickly (hence the need to anticipate), the conditions enabling it are often building for some time — just waiting for the “spark.” It is these conditions and what they are potentially “ripe” for — not the spark — that anticipatory intelligence should seek to understand… Foresight involves imagining how a broad set of possible conditions (trends, actors, developments, behaviors, etc.) might interact and generate emergent outcomes.”

This raises the question of which foresight methods and tools are most effective. We go into great detail about this in our Strategic Risk Governance and Management course. In this blog post we’ll highlight four key insights.

Traditional scenario methodologies often disappoint

As a general rule, when reasoning from the present to the future, we naturally minimize the extent of change that could occur (in order to maintain our sense of psychological safety).

In complex systems, it is almost always impossible to reduce the forces that could produce non-linear change to just two critical uncertainties, as is done in the familiar “2 x 2” scenario method. And in some cases, the uncertainties that most worry an organization’s senior leaders are either out of bounds for the scenario construction team, or the range of their possible outcomes is deliberately constrained.

I first studied the scenario methodology under Shell’s Pierre Wack back in 1983. In its early applications, this approach was often able to fulfill its goal of changing senior leaders’ perceptions. Over the years, however, I have seen what I call “scenario archetypes” become more common, which has weakened scenarios’ ability to surprise leaders and change their perceptions.

These archetypes result from one critical uncertainty being technological in nature, and the other being one whose negative outcome would be very bad indeed. This gives rise to three archetypes: (1) Business pretty much as usual, with current trends linearly extrapolated (this is usually the scenario that explicitly or implicitly underlies the organization’s strategy); (2) The World Goes To Hell (slow technology change and the negative outcome for the other uncertainty); and (3) Technology Saves the Day (fast technology change overcomes the negative outcome of the other uncertainty). This leaves what is usually the least well defined but potentially most important scenario, in which technology rapidly develops but the other uncertainty does not have the negative outcome. Too many organizations fail to fully explore the implications of this scenario, usually because its implications are the most realistically threatening to the current strategy.

Historical analogies are limited by our knowledge of history

Whether the subject is political, economic, technological, business, or military history, most of us have studied too little of it to have a rich base of historical analogies from which we can draw while trying to anticipate the future.

Consider some of the challenges we face at present, including the transition from an industrial to an information- and knowledge-based economy; the rapid improvement in potential “general purpose” technologies like automation and artificial intelligence; and the potential transition of the global political economy from a period of growing disorder and conflict to a period of more ordered conflict due to a new Cold War between the US and China. In all these cases, the most relevant historical analogies may lie further in the past than many people realize.

Prospective hindsight – reasoning from the future to the present – is surprisingly effective

Research has shown that when we are given a future event, told that it is true, and asked to explain how it happened, our causal reasoning is much more detailed than if we are simply asked, in the present, how this future event might happen.

However, that still leaves the “creative” or “imaginative” challenge of conceiving of these potential future events. We have found that starting with broad future outcomes – e.g., our company has failed; China has successfully forced the US from East Asia – generates a richer set of alternative narratives than a narrower focus on specific future events.

Explicitly focusing on system interactions helps identify emergent effects and early warning indicators

Quantitatively, agent-based models, which simulate interactions among many different types of agents, can produce surprising emergent effects and, critically, help you understand why they occur (which can aid either in predicting them or in designing interventions to promote or avoid them).
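As a toy illustration of the point (not any specific published model), here is a minimal agent-based sketch in which agents mix at random and infection spreads between them. Varying a single micro-behavior, the number of daily contacts, flips the emergent outcome from a fizzled outbreak to a large one, and because the mechanism is explicit, the model makes it easy to see why.

```python
# Toy agent-based epidemic sketch (illustrative only): agents mix at random each day;
# infected agents transmit with a fixed probability and recover after a set period.
# Emergent effect: a small change in contacts per day flips "fizzle" into "outbreak".
import random

def simulate(n_agents=1000, n_days=60, contacts_per_day=4,
             p_transmit=0.05, infectious_days=7, seed=42):
    random.seed(seed)
    state = ['S'] * n_agents          # 'S' susceptible, 'I' infected, 'R' recovered
    days_infected = [0] * n_agents
    state[0] = 'I'                    # seed a single infection

    for _ in range(n_days):
        infected = [i for i, s in enumerate(state) if s == 'I']
        for i in infected:
            for _ in range(contacts_per_day):
                j = random.randrange(n_agents)            # random mixing
                if state[j] == 'S' and random.random() < p_transmit:
                    state[j] = 'I'
        for i in infected:
            days_infected[i] += 1
            if days_infected[i] >= infectious_days:
                state[i] = 'R'

    return state.count('I') + state.count('R')           # everyone ever infected

for contacts in (2, 4, 6):            # implied R0 of roughly 0.7, 1.4, and 2.1
    print(f"{contacts} contacts/day -> {simulate(contacts_per_day=contacts)} of 1000 agents ever infected")
```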

Qualitatively, we have found it very useful to create traditional scenarios in narrower policy areas (e.g., technology, the economy, national security, etc.) and then explicitly trace and assess overall system dynamics and how different scenario outcomes could interact across time and across policy areas (e.g., technology change often precedes economic and national security change) to produce varying emergent effects.

Kerbel concludes by noting that, “Exponentially increasing global complexity is the defining characteristic of the age.” Because of this, effective anticipatory intelligence capabilities are more important than ever before to organizations’ future survival and success – and more challenging to develop.