The Aliens Have Landed: How We Created Them

Last week in Time magazine, Eliezer Yudkowsky issued a stark warning. He wrote that if an AI with superhuman intelligence is built under anything like today's conditions, the likely result is the death of everyone on Earth.

Under present conditions, if somebody builds a too-powerful AI, it is almost certain that every human, and all biological life on Earth, will perish shortly thereafter.

Yudkowsky is no fringe prophet. He heads the Machine Intelligence Research Institute, a nonprofit based in Berkeley, California, and has written extensively on artificial intelligence.

I vividly recall, from when I was researching my book Doom, his warning that somebody could inadvertently build an AI that turns against us.

“For example,” Yudkowsky says, “because we tell it to halt climate change and it concludes that annihilating Homo sapiens is the optimal solution.”

Some years ago, Yudkowsky proposed a revision of Moore’s law: every 18 months, the minimum IQ needed to destroy the world drops by one point.

He now believes we are close to the danger point: the creation of an AI more intelligent than humans, with potentially dire consequences.

Such an AI, Yudkowsky continues,

“does not do what we want, and does not care for us nor for sentient life in general. … The likely result of humanity facing down an opposed superhuman intelligence is a total loss.”

He is therefore advocating a complete, worldwide ban on the development of such AI, warning that a superintelligent system could, for example, construct synthetic life forms and deploy them as biological weapons against humanity. His point is unambiguous: we must end all development of artificial intelligence of this kind.

Several prominent individuals, among them Elon Musk and Apple co-founder Steve Wozniak, have signed an open letter calling for a six-month pause in the development of AI systems beyond their present level. Yudkowsky's demand goes much further than theirs.

Yudkowsky’s motivation is the same as that of the letter's signatories: to ensure that AI with superhuman capabilities is not developed in the absence of an international regulatory framework. Where he differs is in doubting that such a framework can be created within six months, and on that point he is very likely right.

It is natural to compare artificial intelligence to nuclear weapons and biological warfare, two earlier fields of research that carried the potential for catastrophe. From the start, it was evident that, left unchecked, they could cause immense damage, perhaps even humanity's complete eradication.

In 1946, the United States presented the Baruch Plan, a proposal to place nuclear research under international control and prevent the spread of atomic weapons; that endeavor took far longer than six months and was only partially successful.

The Soviet Union rejected the proposal, and an intense nuclear arms race followed. Even so, a milestone was reached with the Non-Proliferation Treaty of 1970, which limited the number of states permitted to possess nuclear weapons; later arms-control agreements eventually reduced, and even reversed the growth of, the superpowers' arsenals.


