I agree with Dan that people decide. The overarching issue is the prisoner's dilemma.

It seems like technology is only inevitable when economics and innovation align. So far, supersonic flight, cryptocurrency, and drone delivery have all been created but never widely adopted. Flying cars are also technically possible but far from feasible.

You don’t even have to go as big as the atom bomb to prove this point. Plenty of civilizations got on just fine without the wheel. Or written language, for that matter, still one of the most powerful (and dangerous) technologies ever developed.

One of Kurt Vonnegut's most famous books, "Cat's Cradle," is all about scientists choosing to do something that nobody should. He clarified and distilled the message in one of the non-fiction lectures you can find in his book of essays.

Vonnegut came to his hatred of war honestly, by living through the Dresden firestorm, a prequel to the even bigger Tokyo firestorm that killed 70,000, more than either atomic bomb.

Which provides my segue to book recommendation #2, "War" by Gwynne Dyer, where he makes the oft-ignored point that atomic bombs were never needed: we could already destroy whole cities in a single attack, causing even more damage and death than a nuke. Atom bombs just reduced it to one plane and made it far *cheaper*.

But the UN was formed not (just) because of nukes, but because that ability to wipe out whole nations - destroying enough productive capacity that they'd starve back to medieval population levels whether killed outright or not - meant that the next war could already destroy civilization.

Dan's point that Manhattan was expensive comes in here - nukes were cheaper once invented, but for the cost of Manhattan, you could have staged hundreds of Tokyo-sized raids, enough to win WW2 and WW3 by destroying Russia, China, anybody.

There's actually no theoretical limit to the yield of a hydrogen bomb, that I know of. One 30MT bomb has been built, but we could build 100MT or 200MT hydrogen bombs. Apparently, they aren't "inevitable"! They're not necessary. Neither were any of the others.

I definitely agree with the headline point.

But regarding this: "Both sides, I think, also agree that AI is similar to the atom bomb in that it could be used to wonderful ends (drop a bomb on Hitler) or it could threaten the very existence of civilization (Hitler dropping bombs)."

Am I the only one who thinks it ridiculous hyperbole to believe that AI can "threaten the very existence of civilization" in the way that nuclear bombs could? Beyond the imaginings of creative science fiction authors -- and I love the Terminator movies as much as anyone -- what is the objective evidence for this claim?

Nuclear bombs, like most technology, were inevitable. It was not inevitable that they would be used as weapons. In parallel to the Manhattan Project, many of the same folks, such as Freeman Dyson, were developing nuclear explosions for other purposes. Dyson's project, called Orion, was to send extremely heavy spacecraft into space using a series of exploding nuclear bombs, one after another, as the propellant. There were other proposals to use exploding nuclear bombs to dig big canals like the Panama Canal. This work could have been distributed over many decades and many countries.

The high cost of the Manhattan Project was in large part due to its urgency, secrecy, and focus. Also, as nuclear power technologies were developed, the knowledge of how to not let a generator blow up would have been a natural by-product. Even without WW2 or the Manhattan Project, the knowledge of how to make a nuclear bomb would be known by 2023, so that even a high school student could do it. (And the material to do so might be more available today if it were not for the atom bombs that were made.) My longer explanation of why technologies are inevitable in the broad sense (the telephone, but not the iPhone) is in my book What Technology Wants.

Most excellent! Thank you for this wise article.

I've been writing on this topic for years now, and I'm sorry to report that it's not an easy hill to climb. https://www.tannytalk.com/p/our-relationship-with-knowledge

Essentially what's happening is that we are trying to run the 21st century on a simplistic, outdated, and increasingly dangerous 19th century "more is better" relationship with knowledge. That is, we're failing to adapt to the revolutionary new environment brought on by the success of modern science.

I've come to the conclusion that we'll likely be incapable of learning this through the processes of reason, and that it may indeed be inevitable that we'll keep pushing forward as fast as we can until we crash into some kind of calamity wall.

As to AI, I've been asking this question everywhere I go, and nobody seems to have an answer. What are the compelling benefits of AI which justify taking on even more risk at a time when we already face so many?

It's great that you reference nuclear weapons in your article, as the AI community seems to have a great deal of difficulty learning anything from that experience. Further info on nukes available here: https://www.tannytalk.com/s/nukes

I agree. I remember seeing somewhere, perhaps even on this blog, how the Japanese were introduced to gunpowder, basically rejected its use, and went back to fighting as samurai for several centuries. I don't know enough about the story, but it does suggest that we as humans have the agency to adopt or reject specific technologies.
