How governments can better understand global catastrophic risk

The policy context

Since the mid-twentieth century, global trends in technology, politics, demographics and environmental impact have resulted in unprecedented levels of risk for human society. 

A set of risks now threatens human security, prosperity and potential to an extent never before seen in human history. These global catastrophic risks have the potential to inflict significant damage to human wellbeing on a global scale. In the most extreme case, the entire species could face extinction or permanent collapse.

The human species has always faced the risk of existential or global catastrophe from natural hazards, such as supervolcanoes and asteroids. More recently, anthropogenic or human-driven threats to humanity and human civilisation have emerged and probably become a greater risk. These global catastrophic risks include advanced artificial intelligence, extreme climate change, nuclear winter and engineered pandemics.

The potential for harm posed by these risks means that national governments have a responsibility to their citizens to proactively implement policy to prevent, prepare for and respond to these risks. But the first step in any risk management process is to understand the risks.

Governments around the world are beginning to turn greater attention to these risks. For example, the US National Intelligence Council highlighted these risks in its most recent Global Trends Report, the US Intelligence Community’s flagship report produced every four years for the incoming presidential administration:

“Technological advances may increase the number of existential threats; threats that could damage life on a global scale challenge our ability to imagine and comprehend their potential scope and scale, and they require the development of resilient strategies to survive. Technology plays a role in both generating these existential risks and in mitigating them. [Human-induced] risks include runaway AI, engineered pandemics, nanotechnology weapons, or nuclear war.”

The Secretary-General of the United Nations also recognised global catastrophic risks in his 2021 report ‘Our Common Agenda’:

“These risks are now increasingly global and have greater potential impact. Some are even existential: with the dawn of the nuclear age, humanity acquired the power to bring about its own extinction. Continued technological advances, accelerating climate change and the rise in zoonotic diseases mean the likelihood of extreme, global catastrophic or even existential risks is present on multiple, interrelated fronts. Being prepared to prevent and respond to these risks is an essential counterpoint to better managing the global commons and global public goods.”

But more work is needed for governments to better understand these risks and to turn that understanding into policy action.

The policy vision

Governments must ensure that they sufficiently understand global catastrophic risk in order to design prevention, preparation and response measures. National governments should have a strong ability to identify, analyse and monitor the risks. They must also have a strong understanding of their own government’s and nation’s contribution to global catastrophic risk.

Global catastrophic risks should be considered as a set, enabling governments to allocate resources according to how they prioritise the risks. Lessons and knowledge about one risk could then be transferred to others, and a set-wide view would help ensure that policy responses to one risk do not exacerbate others.

The policy problem

National governments often struggle to understand extreme risks in general, and global catastrophic risks in particular. Three primary factors drive this poor understanding.

First, the nature of global catastrophic risk as an issue set makes it difficult to understand and analyse. The scale of the risk is unprecedented in human history. Global catastrophic risks impact human civilisation and its future and, at worst, threaten human extinction. There is great uncertainty regarding how these risks unfold, how likely the scenarios are, and when the risks could occur. And many of the risks are novel and only now emerging on the horizon. Although risks around nuclear winter and climate change, for example, have been known for decades, catastrophic technology-based threats, such as artificial intelligence and engineered pandemics, are yet to be fully realised.

Second, governments, like people, can find it hard to think creatively about the future. Bureaucratic structures are set up for existing problems, and foresight capability is small and nascent. To the extent that analysis and imagination of the future is conducted, governments find it difficult to feed the findings into strategic policy. And futures analysis can be misguided if conducted by those who suffer from groupthink or myopia.

Third, scientific and technical expertise on extreme risks, including global catastrophic risks, is often lacking or inconsistent. Aside from defence and civilian research agencies, deep subject matter expertise, particularly on technology issues, tends to reside outside the public sector. This expertise is crucial for improving the understanding of the political leaders and senior officials who develop policy. And engagement with the science community is often ad hoc or poorly managed.

The policy options

Governments must take measures to better understand global catastrophic risk and implement structures and processes that enable decision-makers to be better informed about the risk. A better understanding of global catastrophic risk includes: the set of threats and hazards; the vulnerabilities to those threats and hazards; pathways and scenarios for different risks; the drivers and factors that create and exacerbate risk; and the implications for society, the economy, security, the environment and other policy priorities.

The following actions enable governments to quickly and cheaply improve their understanding of the global catastrophic risk their countries face:

  • Commission an independent review of extreme risks, similar to the UK’s Blackett Review of High Impact Low Probability Risks in 2012
  • Map each of the global catastrophic risks against impact on critical infrastructure systems to find gaps and vulnerabilities
  • Conduct a review of the government’s horizon-scanning capability, similar to the UK’s review of cross-government horizon scanning in 2013, and develop a report on major trends identified in existing horizon-scanning products relevant for global catastrophic risk
  • Allocate technology experts within the intelligence and defence community to conduct ongoing analysis of extreme technological threats, such as engineered pandemics, runaway artificial intelligence and highly advanced autonomous weapons
  • Conduct a review of the current allocation of resources and research effort to global catastrophic risk across civilian and defence science agencies
  • Develop a shared list of policy and research questions between the policy and academic communities on extreme and global catastrophic risk, similar to the “80 questions for UK biological security” exercise

Beyond these efforts, governments must take strategic policy action to improve their understanding of global catastrophic risks across four areas: 

  • Risk assessment: identify and analyse extreme and global catastrophic risks holistically to sufficiently inform policy decisions to manage them
  • Futures analysis: improve the practice and use of futures analysis, including horizon-scanning, forecasting and foresight activities, to alert policy-makers to emerging risks and challenges and to facilitate better long-term policy
  • Intelligence and warning: improve intelligence and warning capability on extreme and global catastrophic risks to inform governments about trends, events and risks in the global landscape
  • Science and research: increase governments’ science and research capability on global catastrophic risk so that policy solutions are supported by cutting-edge technical expertise

The following sections provide specific policy actions for each of these areas.