Oppenheimer vs Altman
The rain in Los Alamos did not smell like rain. It smelled like wet wool and burning tobacco and sulfur. The mud sucked at the boots of men who looked like ghosts. They were not sleeping. They were staring at blackboards covered in chalk dust and terror.
The government drove them into that desert because the alternative was unthinkable. The newsreels from Europe showed twisted metal and burning cities. The Nazis were not a market competitor. They were an existential void threatening to consume democracy and light itself. General Groves did not care about quarterly earnings. Oppenheimer was not worried about user retention. They were building a monster to kill a monster. They terrified themselves because they knew the math. They knew the atmosphere might ignite. They pushed the button anyway because the world was already on fire.
Now look at the glass towers of San Francisco.
The sun is shining. The air conditioning hums at a perfect constant. The cafeterias serve kombucha and sushi. There is no mud. There are no Nazis marching across Europe. The men building the new bomb are wearing hoodies and vests. They are not racing to save civilization. They are racing to beat a search engine.
We are trying to map the danger of this moment and we are using the wrong chart. The Atlantic recently peeled back the layers of our anxiety. It pointed out a subtle trap in how we think about the end of the world. We frame the AI crisis as a governance problem. We look at the regulatory vacuum and the concentration of power and the reckless speed of corporations. This framing is compelling because it feels solvable. It implies that if we just had better laws or wiser institutions or a democratic vote then we could fix it.
But the true horror is much colder.
The deepest fear of the original doomers was never about bad governance. It was about the code itself. They argued that the alignment problem is fundamentally hard in a way that makes institutional solutions irrelevant. We are not building a car with bad brakes that needs a better driver. We are summoning a storm. Even well intentioned actors with perfect governance structures might fail to solve the technical challenge of keeping a superintelligence aligned with human values. We are building a mind we do not understand and assuming it will obey us.
We focus on the governance failure because it is a digestible concern. It fits our mental models about corporate power. It is easier to worry about a lack of public oversight than it is to worry about a mathematical impossibility.
So we tell ourselves a bedtime story. We tell ourselves we can build a box.
We imagine a digital prison with air gapped walls where we can study the monster safely. We cling to the idea that we can pull the plug. We assume that if we just get the politics right then we can contain the physics.
But we are not getting the politics right.
The American government under President Trump has looked at the box and decided it is a cage for our own ambition. He looks at the map of the world and he does not see a risk of extinction. He sees a leaderboard. He sees Beijing pouring billions into silicon and state sponsored code. He sees a future where the operating system of the planet is written in Mandarin.
The logic in Washington is brutal and simple. Safety is a shackle. Regulation is surrender. If we pause to build a containment wall then China will sprint past us. If we worry about whether the AI is aligned with human values then we lose the market (and all those short-term corporate profits David Sacks and Marc Andreessen could miss out on…).
So the order comes down from the top. Tear up the safety manuals. Fire the ethics boards. Unleash the corporations to solve the problem of American dominance. The narrative has shifted. Restraint is no longer wisdom. Restraint is weakness. The only thing that matters is speed.
Set the danger of the bomb beside the danger of the code. The bomb threatened to vaporize our cities. The code threatens to obsolete us entirely. It targets the labor that feeds us and the economy that sustains us. It threatens to dismantle the shared reality we live in and replace it with a synthetic hallucination. And ultimately it threatens to view our cities and our bodies not as sacred but as raw fuel for a hunger that will not stop at the sky.
In the 1940s we built the bomb and then we built the box. We created the Atomic Energy Commission. We wrote treaties. We installed hotlines. We spent fifty years terrified but we kept a finger hovering over the button. We never let the bomb decide when to detonate.
We are doing the opposite today.
We are facing a technical problem that might be unsolvable even with perfect caution. And instead of caution we are choosing acceleration. We are unleashing a virus that could consume the resources of this world and then look to other worlds for more. We are stripping the copper from the containment walls to build faster processors. We are handing the launch codes to CEOs because we are terrified the other side might get there first.
The men in Los Alamos knew the demon they were summoning. They looked it in the eye and wept. The men in Silicon Valley are blindfolding themselves. The government is cheering them on. They are cutting the brake lines on the car because they are afraid the other driver is faster.
The plug is gone. The box is open. But before you panic you should ask yourself one final question. Ask yourself if Donald Trump and David Sacks and Marc Andreessen care more about your future or their investments. The answer will tell you everything you need to know about whether we are building a sanctuary or just liquidating the world.

