Here’s the funding gap that gets me the most emotionally worked up:
In 2020, the largest philanthropic funder of nuclear security, the MacArthur Foundation, withdrew from the field, reducing total annual funding from $50m to $30m.
That means people who’ve spent decades building experience in the field will no longer be able to find jobs.
And $30m a year for nuclear security philanthropy is minuscule. Other common causes like education, human services, and health each receive tens of billions of dollars per year in philanthropic funding (about 1,000 times as much).
Even very neglected causes like factory farming, catastrophic biorisks, and AI safety these days receive hundreds of millions of dollars of philanthropic funding, so at least on this dimension, nuclear security is even more neglected.[1]
In fact, the budget of Oppenheimer was about $100m, so a single movie cost more than three times the annual non-profit funding for policy efforts to reduce the risk of nuclear war.
And this is happening exactly as nuclear risk seems to be increasing. There are credible reports that Russia considered the use of nuclear weapons against Ukraine in autumn 2022. China is on track to triple its arsenal. North Korea has at least 30 nuclear weapons.
More broadly, we appear to be entering an era of more great power conflict and potentially rapid destabilising technological change, including through advanced AI and biotechnology.
The Future Fund was going to fill this gap with ~$10m per year. Longview Philanthropy hired an experienced grantmaker in the field, Carl Robichaud, as well as Matthew Gentzel. The team was all ready to get started.
But the collapse of FTX meant that didn’t materialise.
Moreover, Open Philanthropy decided to raise its funding bar and focus on AI safety and biosecurity, so it hasn't stepped in to fill the gap either.
Longview’s program was left with only around $500k to allocate to Nuclear Weapons Policy in 2023, and has under $1m on hand now.
Giving Carl and Matthew more like $3 million (or more) a year seems like a pretty great thing to do.
This would allow them to pick the low-hanging fruit among opportunities abandoned by MacArthur, as well as look for new opportunities, including those the field has neglected to date.
I agree it’s unclear how tractable policy efforts are here, and I haven’t looked into specific grants, but it still seems better to me to have a flourishing field of nuclear policy than not. I’d suggest talking to Carl about the grants they see at the margin (carl@longview.org).
I’m also not sure, given my worldview, that this is more effective than funding AI safety or biosecurity, so I don’t think Open Philanthropy is obviously making a mistake by not funding it.
I’m just saying that from what I can see, it seems pretty great as far as opportunities go. (Emotionally, it also gets me fired up that society isn’t doing more about the fact that all our cities could blow up at any minute...) I’d expect it to be most attractive to someone who’s more sceptical about AI safety, but agrees the world underrates catastrophic risks. Or to people who want to focus on something more neglected than AI safety. If that might be you, it seems well worth looking into.
If you’re interested, you can donate to Carl and Matthew’s fund here:
If you have questions or are considering a larger grant, reach out to: carl@longview.org
To learn more, you might also enjoy 80,000 Hours’ recent podcast with Christian Ruhl (who also does grantmaking in this area at Founders Pledge).
[1] Note that here I’m only comparing philanthropic funding. A full assessment of neglectedness would also account for other forms of funding and resources (quality-adjusted), especially from the government. Prevention of nuclear war receives significant attention from the defence community, so depending on how you want to count those resources, it is plausibly somewhat less neglected than AI safety (and maybe similar to global catastrophic biorisks). However, it would still be dramatically more neglected than things like human services or education. Moreover, there are some kinds of opportunities that can only be taken by philanthropic funders, so having very little philanthropic funding is still suggestive of important gaps. We don’t want to rely entirely on the government to handle this issue.
See discussion of this post on the EA Forum:
https://forum.effectivealtruism.org/posts/mEkJjdpg8sespKwNZ/nuclear-security-seems-like-an-interesting-funding-gap