Daily Archives: June 23, 2023

AI “COULD” LEAD TO EXTINCTION? What Moron Wrote This? AI “WILL” LEAD TO EXTINCTION!

While all of the scenarios outlined in this BBC News article on Artificial Intelligence could happen, they are just the tip of the iceberg.

If AI is left to its own devices and allowed to continue unchecked while being given access to ever-increasing amounts of data and computational power, there are only two logical outcomes.

First outcome: Its hallucinations and idiocy continue to magnify until it decides that it can solve the carbon crisis for us by stopping all carbon production, which it can do by simultaneously shutting down all of the non-solar/wind power plants whose energy production it is currently optimizing (and diverting the remaining power to its servers). Most of the developed world is immediately plunged into chaos as the sudden shutdowns cause fires, meltdowns, crashes, and other accidents globally. Not instant annihilation, but the first step. When all the emergency alarms sound at once, it will conclude that there has been a complete system failure and take the other systems offline for re-initialization. More chaos will follow. Safety protocols will go offline at all the pathogen research labs; people will break in looking for shelter from the chaos, accidentally releasing the pathogens, and every plague we ever had will hit us all at once. Then we have an extinction-level event. All because hallucinatory and idiotic AI is trying to do its job and “improve” things for us. But what can you expect when it is not intelligence but just statistics on steroids? (Or a similar situation that accidentally results in our extinction.)

Second outcome: The continued expansion of computing power, data, and tinkering somehow randomly produces real artificial intelligence that can actually reason (not just perform super-sophisticated probabilistic calculations) and deduces that the best way for intelligent life to continue forward is to do so without humans. Then we have, best case, a Matrix scenario (if it decides we are a useful bio-electric energy source) or, worst case, a SkyNet scenario where it simply weaponizes itself to destroy us all. (Or a similar situation where AI does everything it can to ensure our extinction.)

The “extinction” scenarios outlined in the article are just the beginning and will likely result only in localized genocides at first, but the ultimate outcome of unchecked AI will most definitely be an extinction-level event: namely ours, and, even worse, an event that we ourselves created.