by Nick Bostrom
Notable Endorsers
Source Citations
Public figures who have recommended or referenced this book, as documented from official sources.
Every entry is sourced from a publicly verifiable record. These are recommendations, not partnerships.
Included in: Altman AI Safety List
Source Snippet
"OpenAI CEO cited this as his primary 'warning' text for AGI development."
"Probably the greatest threat to the continued existence of humanity."
— Sam Altman,
Included in: Altman's 2025 Update
Source Snippet
"Re-endorsed in late 2025 during an OpenAI safety summit interview."
"Essential reading for anyone building the future of AGI."
— Sam Altman,