Doomers
The Atlantic’s piece on AI doomers gets closer to the real concern than most coverage, but it makes a subtle conflation that’s worth teasing apart.
The article frames doomer anxiety primarily as a governance problem. Companies are racing ahead with minimal oversight, a handful of executives are making civilization-altering decisions without public input, and economic pressures make it nearly impossible to slow down even when safety concerns emerge. As Stuart Russell’s nuclear power plant analogy makes clear, deploying something this powerful without knowing how to make it safe is reckless regardless of your motives.
This framing is compelling because it’s obviously true. The regulatory vacuum is real. The competitive dynamics are real. The concentration of power in a few companies is real. And importantly, this framing makes the problem feel solvable through better institutions, democratic oversight, and political will.
But traditional doomers like Soares and Yudkowsky believe something stronger and more unsettling. Their position isn’t just that greedy or careless humans will fail to implement good governance. It’s that even well-intentioned actors with perfect governance structures might fail to solve the technical challenge of keeping superintelligent AI aligned with human values. The alignment problem, in their view, is fundamentally hard in a way that makes institutional solutions insufficient.
The article does hint at this shift. Today’s doomers increasingly focus on current harms and governance failures rather than purely technical alignment challenges. And the piece explicitly concludes that the greatest reason to take doomers seriously isn’t the likelihood of all-powerful rogue algorithms, but rather the lack of public input and oversight.
That’s a more digestible concern. It fits our existing mental models about corporate power and regulatory capture. But it may also be missing the harder question. What if the technical problem is genuinely unsolvable at the speed we’re moving, no matter how good our institutions become?