David R. Henderson  

This is Scary


In our discussion about "designer babies," I took on Dan Klein's main argument against them--that we might get super-achieving babies and thus lose our coherence with the past.

In the comments on my post on this topic, Dan raised a new concern that seemed to have nothing to do with designer babies. He wrote:

Technological progress will continue to lower the cost of destruction. Extrapolate the progress to a scenario in which any nutjob with a softball-sized device can take out a city block. There is a vast asymmetry between production and destruction, an asymmetry that should make us very concerned about the declining cost of destructive technology. You ask, "Should We Fear Progress?" In this matter, surely you agree we should have apprehensions about progress.

I do agree. This one makes me afraid, especially if it gets in the wrong hands. The "wrongest hands" would be those of terrorists. A close second would be the hands of governments. Of course, it will get in the hands of governments because the U.S. government is funding it.

The whole thing reminds me of the creepy Project X in Atlas Shrugged.


COMMENTS (13 to date)
Brandon Berg writes:

This has been a concern at the back of my mind for years. It seems inevitable to me that we will, in the next fifty years, reach the point where anyone with knowledge equivalent to a bachelor's degree in the relevant field can engineer a plague, whether in the form of a virus or a nanobot, capable of killing tens or hundreds of millions of people.

I don't see an obvious solution to this problem.

Brad writes:

Nuclear weapons have existed for half a century but have been used only twice. The threat of miniature nukes seems to have peaked long ago without incident. So, while this latest CHAMP weapon's novelty raises concerns, the perceived danger is probably higher than the reality. We haven't had a 9/11-style attack since that day, even though the cost of one is fairly low (a band of terrorists attacking crowded shopping malls seems fairly easy to pull off), especially in our open society. I think the greatest defense against such things is a people who value liberty and freedom above all else.

Back to my pool....

Scott Sumner writes:

I'm very worried about bioterrorism. Not right now, but in a few decades.

Shane L writes:

The "wrong hands" could just be one deluded individual. In a few centuries we've gone from spears to nukes. In a few centuries more where will we be? How about in 10,000 years? Eventually it strikes me that it might be so cheap and easy to wreak enormous damage that all life might be vulnerable to just one individual crank - and we have lots of individual cranks.

That is a scary thought! However, I hope that technology to control or block such destructive energy might advance sufficiently too, so that individuals are not able to wield such god-like powers with no defences for the rest of us.

Ben writes:

This problem – the trend toward humanity's power increasing without bound while our maturity and sanity do not – is precisely why I have chosen not to have children. It will not end well. I find this the most compelling answer to Fermi's Paradox. Where is everybody? They have all self-destructed.

Chris writes:

Anyone else seen Forbidden Planet?


One of many ways one can envision us destroying ourselves.

"There is a vast asymmetry between production and destruction"

Is there? It's not impossible to shield against these pulses.

Pajser writes:

The solution is in artificial intelligence. If needed, for every single person there will be a hundred intelligent robots whose only purpose is to check whether their human is making trouble. The transformation of the UN into a world state might be completed in, say, 200 years. It is possible, but improbable, that the world state will be destructive. Humanity has a fair chance to survive. Not that it is deserved.

TSB writes:

On the flipside, I have some hope that the threat of low cost, easily targeted WMDs will reverse the growth of very large cities, hopefully without ever being demonstrated. I worry that population concentrations larger than a few hundred thousand have external diseconomies that outweigh the benefits. The national economic and individual threat of living in an obvious target might reinvigorate happy provincialism, while mass media and communication technologies mitigate the otherwise attendant narrow-mindedness.

--> Honey, the TV is on the blink. And everything else.

From "EMP Attack: A Primer and Suggestions for Preparedness":

"Consider a nuclear explosion 200 miles up. It emits a powerful wave of radiation which causes a huge electrical wave. You won't feel much, but it will fry all the electrical systems and digital electronics that we depend on to run everything."

NZ writes:


Funny you should bring up artificial intelligence. AI seems to be the one technology right now that's receiving even MORE thoughtful deliberation over its ethics and long-term implications than genetic engineering.

As I've been saying, we ought to give this kind of consideration to all new technology, not just AI and genetic engineering. One could even argue that the technologies most in need of this kind of thought are the ones the public is most likely to accept uncritically, e.g. social media or automotive electronic cognitive assistance.

Jeff writes:

Well, at least rebuilding will be pretty cheap. In general, I share your concerns, though.

Mark Bahner writes:
"I'm very worried about bioterrorism. Not right now, but in a few decades."

One thing I think the Terminator movies got wrong (by not including) was the idea of Terminators using not guns and explosives, but biological weapons.

For all the (literally) millions of hours spent worrying about climate change, there has been orders of magnitude less concern about things that have a much higher probability (in my considered opinion) of killing orders of magnitude more people.

Comments for this entry have been closed