
Topic 6: AI, Biosecurity & Existential Risks

Objectives

This week we will explore some concrete sources of anthropogenic existential risk, defined as risks caused or influenced by humans. We will look at several examples, including Artificial Intelligence (AI), biological risks, and nuclear war. These resources are all stand-alone, so feel free to read them in any order! The estimated reading time is about 1 hour.

Note that this week you will discuss these readings in depth at a group discussion with your fellow mentees.

Essential readings (expected reading time 1h)

This 80,000 Hours article (12 min) outlines the major arguments for why AI may be a pressing problem, using the scale, tractability, and neglectedness framework. It also notes some key objections to working in this area.

To learn more about another existential risk, read this separate 80,000 Hours article (25 min), which provides a comprehensive review of why mitigating global catastrophic biological risks (GCBRs) may be an important cause area. It outlines why these risks are plausible and explains how the area may be relatively neglected, despite large amounts of funding going towards biological risks in general.

Finally, this 80,000 Hours problem profile (10 min) gives a brief outline of why nuclear security is an important cause area and suggests some potential ways to reduce the risk.

Optional readings

This podcast with Toby Ord (3 hours) is an in-depth look at existential risk. If you prefer reading to listening, you can find the transcript here. It covers many of the key points from his book The Precipice; feel free to skip forward to any parts that are of particular interest to you.

To get a sense of specific projects that could reduce existential risk, read this 80,000 Hours post (10 min), which highlights over 50 policy and research ideas relating to risks such as engineered pandemics, unaligned AI, climate change, environmental damage, and nuclear weapons.

An additional resource is this article by Brian Tomasik (20 min). It discusses whether faster intellectual progress in various fields would be net positive or negative for the long-term future.

Finally, this 80,000 Hours profile (15 min) covers climate change, with a focus on whether it poses an existential risk. Like the other profiles, it touches on reasons both for and against working on this cause area specifically.
