If Anyone Builds It, Everyone Dies

From Wikipedia, the free encyclopedia
2025 book about artificial intelligence

If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
UK edition cover (2025)
Author: Eliezer Yudkowsky and Nate Soares
Subject: Existential risk from artificial intelligence
Publisher: Little, Brown and Company
Publication date: 16 September 2025
Pages: 256
ISBN: 9780316595643

If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All (published in the UK with the alternative subtitle The Case Against Superintelligent AI) is a 2025 book by Eliezer Yudkowsky and Nate Soares which details potential threats posed to humanity by artificial superintelligence.

The book was published by Little, Brown and Company, and appeared on The New York Times Best Seller list on October 5, 2025.[1]

Synopsis

Modern AI systems are "grown" rather than "crafted": unlike traditional software that consists of code created by humans, modern AI systems consist of hundreds of billions to trillions of numerical parameters called weights whose functions are opaque to researchers. These numbers can be found using enormous computing power, but humans do not truly understand how they work and can neither specify nor control their values. When an AI system threatens a New York Times reporter, or calls itself "MechaHitler", no one can look inside, find the line of code responsible for that behavior, and fix it.[2]

Humans can train AI systems to be generally competent. An AI that tries to achieve goals will perform better on many metrics, so it will be selected for by the training process. However, due to the nature of modern machine learning, it is not possible for humans to specify the goals that a superintelligent AI system should try to pursue. With current technology, the AI's goals would not contain anything of value to humans.[3]

Just as humans would lose a game of chess against Stockfish, they would lose against an AI system that is generally more competent than they are. It is hard to predict the exact path such a system would take, as doing so would require being as good at achieving goals as the AI itself, but several paths are available to it. A superintelligence would not care about humans, but it would want the resources that humans need. Humanity would thus lose and go extinct.[4]

The book's authors contend that the world's leaders, the scientific community, and everyone else should speak up and warn the world about the danger. To avoid a catastrophe, the authors believe that humanity needs to coordinate to halt large-scale general AI development everywhere, possibly with an exception for narrow AI systems like AlphaFold that would not threaten humanity's existence. At a minimum, as a first step, humanity should impose a global halt on frontier AI research while it gathers more evidence of the danger.

Reception

Reviews by public figures and scientists

Max Tegmark acclaimed it as "The most important book of the decade", writing that "the competition to build smarter-than-human machines isn't an arms race but a suicide race, fueled by wishful thinking."[5]

It also received praise from Stephen Fry,[6] Ben Bernanke, Vitalik Buterin, Grimes, Yoshua Bengio, Scott Aaronson, Bruce Schneier, George Church, Tim Urban, Matthew Yglesias, Christopher Clark, Dorothy Sue Cobble, Huw Price, Fiona Hill, Steve Bannon, Emma Sky, Jon Wolfsthal, Joan Feigenbaum, Patton Oswalt, Mark Ruffalo, Alex Winter, Bart Selman, Liv Boeree, Zvi Mowshowitz, Jaan Tallinn, and Emmett Shear.[7][8][9][10]

Critical reception

Reviews of the book by critics have been mixed.

Upon its release, it was included in The New York Times best-seller lists for hardcover nonfiction and for combined print and e-books nonfiction.[11]

Writing for The New York Times, Stephen Marche compared the book to a Scientology manual and said reading it was like being trapped in a room with irritating college students on their first mushroom trip.[12]

The New Yorker included it in its list of The Best Books of the Year So Far.[13]

The Guardian called it one of the biggest books of the autumn and made it its Book of the Day on September 22, 2025, stating that superintelligent AI is dangerous but humanity can still take steps to avoid disaster.[14] In a review, The Guardian's non-fiction books editor David Shariatmadari wrote that "If Anyone Builds It, Everyone Dies is as clear as its conclusions are hard to swallow" and that anyone who cares about the future has a responsibility to read the book's arguments.[4]

Tom Whipple, the science editor at The Times, described the book as both compelling and disturbing, noting its readability and engaging storytelling, which at times resembled a thriller, "albeit one where the thrills come from the obliteration of literally everything of value". While finding the authors' astonishing and dire claims credible, he expressed hope that they are wrong, given the apocalyptic outcome they describe, one he himself could not see a way to avoid.[15]

Bill Conerly wrote for Forbes that the book persuaded him that the catastrophic risk to humanity was greater than he had previously thought. He noted that the book effectively used parables to argue that AI's self-improvement could lead to unpredictable evolutionary paths which may diverge from the intentions of its human programmers and eventually cause the AI to view humans as a hindrance or simply not valuable enough to warrant resources. He concluded that he was "far more concerned" after reading the book.[16]

Kevin Canfield wrote in the San Francisco Chronicle that the book makes powerful arguments and recommended it.[17]

In The Atlantic, Adam Becker wrote that the book is "tendentious and rambling, simultaneously condescending and shallow. Yudkowsky and Soares are earnest; unlike many of the loudest prognosticators around AI, they are not grifters. They are just wrong...Yudkowsky and Soares fail to make an evidence-based scientific case for their claims."[18]

In an article titled "Why we must pull the plug on superintelligence", Paul Wood wrote for The Spectator: "If more and more people understand the danger, wake up and decide to end the 'suicide race', our fate is still in our own hands. If Anyone Builds It, Everyone Dies is an important book. We should consider its arguments – while we still can."[19]

Publishers Weekly said the book is an "urgent clarion call to prevent the creation of artificial superintelligence" and a "frightening warning that deserves to be reckoned with", but mentioned that some of the parables and analogies are less effective than others and that very few opposing viewpoints are presented.[20]

Kirkus Reviews gave a positive review, calling the book "a timely and terrifying education on the galloping havoc AI could unleash—unless we grasp the reins and take control."[21]

Booklist gave the book a starred review, praising Yudkowsky and Soares for their analysis of the existential threats posed by artificial superintelligence, a scenario that is less about technological advancement than about the survival of humanity. The review likened the book to a "fire alarm" for anyone involved in shaping the future, emphasizing that it demands serious consideration and reflection from all stakeholders regardless of their opinion on its conclusions.[22]

Steven Levy in Wired expressed skepticism regarding the likelihood of AI causing human extinction, finding the authors' proposed solutions for preventing devastation more improbable than their doomsday scenarios, but mentioned a study of AI contemplating blackmail and concluded "My gut tells me the scenarios Yudkowsky and Soares spin are too bizarre to be true. But I can't be sure they are wrong."[23]

The New Zealand Herald made it its Book of the Day on October 21, 2025, asking: "How many chances do you want to take with the future of our species?"[24]

Ian Leslie, writing for The Observer, said the authors wrote their story with "clarity, verve, and barely suppressed glee," making it "a lot of fun" for a book about human extinction. However, he was not convinced that superintelligence as described is imminent, or that if it emerges, it would likely lead to humanity's demise.[25]

Gary Marcus in The Times Literary Supplement wrote that "Things are worrying, but not nearly as worrying as the authors suggest" and that the authors "lay out this thesis thoughtfully, entertainingly, earnestly, provocatively and doggedly. Yet their book is also deeply flawed. It deserves to be read with an immense amount of salt."[26]

Jacob Aron, writing for New Scientist, called the book "extremely readable" but added that "the problem is that, while compelling, the argument is fatally flawed", concluding that effort would be better spent on "problems of science fact" like climate change.[3]

Clara Collier criticized the book in the effective altruist journal Asterisk Magazine for being less coherent than the authors' prior writings and not fully explaining the premises.[27]

Grace Byron in The Washington Post criticized the book for being a polemic with vague instructions rather than a manual. She concluded that while the authors are subject matter experts, the book feels like it was written by "two aggrieved patriarchs tired of being ignored".[28]

Gareth Watkins in the New Statesman wrote that "If Anyone Builds It, Everyone Dies is not a serious book" and that "Yudkowsky's legacy has not been to save the world, but to make it cheaper, sillier, and more Online".[29]

References

  1. ^ If Anyone Builds It, Everyone Dies. Hachette Book Group. 3 March 2025. ISBN 978-0-316-59564-3.
  2. ^ Louallen, Doc. "New book claims superintelligent AI development is racing toward global catastrophe". ABC News. Retrieved 2 November 2025.
  3. ^ a b Aron, Jacob (8 September 2025). "No, AI isn't going to kill us all, despite what this new book says". New Scientist. Retrieved 9 September 2025.
  4. ^ a b Shariatmadari, David (22 September 2025). "If Anyone Builds it, Everyone Dies review – how AI could kill us all". The Guardian. Retrieved 22 September 2025.
  5. ^ "Max Tegmark on X: "Most important book of the decade:"". Twitter.
  6. ^ If Anyone Builds It, Everyone Dies. 18 September 2025.
  7. ^ "vitalik.eth (@VitalikButerin) on X: "A good book, worth reading to understand the basic case for why many people, even those who are generally very enthusiastic about speeding up technological progress, consider superintelligent AI uniquely risky"". Twitter.
  8. ^ "If Anyone Builds It, Everyone Dies: Full Praise".
  9. ^ ""If Anyone Builds It, Everyone Dies" release day! - Machine Intelligence Research Institute". 16 September 2025.
  10. ^ "If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All by Eliezer Yudkowsky, Nate Soares, Hardcover".
  11. ^ "Combined Print & E-Book Nonfiction". The New York Times. Retrieved 25 September 2025.
  12. ^ Marche, Stephen (27 August 2025). "A.I. Bots or Us: Who Will End Humanity First?". The New York Times. Retrieved 9 September 2025.
  13. ^ "The Best Books of the Year So Far". The New Yorker. 22 October 2025. Retrieved 27 October 2025.
  14. ^ "From a new Thomas Pynchon novel to a memoir by Margaret Atwood: the biggest books of the autumn". The Guardian. 6 September 2025. Retrieved 12 September 2025.
  15. ^ Whipple, Tom (13 September 2025). "AI — it's going to kill us all". The Times. Retrieved 13 September 2025.
  16. ^ Conerly, Bill (14 October 2025). "Superintelligent AI Threatens Humanity: A Game-Changing Analysis". Forbes. Retrieved 16 October 2025.
  17. ^ Canfield, Kevin (2 September 2025). "'Everyone, everywhere on Earth, will die': Why 2 new books on AI foretell doom". San Francisco Chronicle. Retrieved 11 September 2025.
  18. ^ Becker, Adam (19 September 2025). "The Useful Idiots of AI Doomsaying". The Atlantic. Retrieved 19 September 2025.
  19. ^ Wood, Paul (23 September 2025). "Why we must pull the plug on superintelligence". The Spectator. Retrieved 23 September 2025.
  20. ^ "If Anyone Builds It, Everyone Dies: Why Superhuman AI Will Kill Us All". Publishers Weekly. Retrieved 11 September 2025.
  21. ^ "If Anyone Builds It, Everyone Dies". Kirkus Reviews. Retrieved 9 September 2025.
  22. ^ "If Anyone Builds It, Everyone Dies". Booklist. 2 September 2025. Retrieved 23 September 2025.
  23. ^ Levy, Steven (5 September 2025). "The Doomers Who Insist AI Will Kill Us All". Wired. Retrieved 9 September 2025.
  24. ^ McLauchlan, Danyl (21 October 2025). "Book of the Day: If Anyone Builds It, Everyone Dies: The case against superintelligent AI". The New Zealand Herald. Retrieved 22 October 2025.
  25. ^ Leslie, Ian (28 September 2025). "Visions of the AI apocalypse". The Observer. Retrieved 29 September 2025.
  26. ^ "AI armageddon?". The Times Literary Supplement.
  27. ^ Collier, Clara (September 2025). "More Was Possible: A Review of If Anyone Builds It, Everyone Dies". Asterisk Magazine. Retrieved 24 September 2025. "It's true that the book is more up-to-date and accessible than the authors' vast corpus of prior writings, not to mention marginally less condescending. Unfortunately, it is also significantly less coherent. The book is full of examples that don't quite make sense and premises that aren't fully explained. But its biggest weakness was described many years ago by a young blogger named Eliezer Yudkowsky: both authors are persistently unable to update their priors."
  28. ^ Byron, Grace (28 September 2025). "Could AI be a truly apocalyptic threat? These writers think so". The Washington Post. Retrieved 29 September 2025.
  29. ^ Watkins, Gareth (28 October 2025). "The guru of the AI apocalypse". New Statesman. Retrieved 31 October 2025.

External links

Wikiquote has quotations related to If Anyone Builds It, Everyone Dies.