March 11, 2015
Russia’s Killer Robot Army
While it might sound like something out of science fiction, Russia is refining robot technology to the point that, by the end of this year, its military robot prototypes are expected to be capable of clearing a demanding obstacle course. This may not seem especially alarming to some, given that the world's militaries already possess equipment and weapons capable of mass destruction. Ethicists and scientists alike, however, are sounding the alarm.
Ethics of Killer Robots
Humanity is at a crossroads. We have reached a level of technological advancement where we can fairly easily create almost anything we can imagine. Unfortunately, we can't always foresee the consequences of our creations. Lethal autonomous weapons systems, known as LAWS (i.e., killer robots), are a form of artificial intelligence that has made some of the leaders in the robotics and AI industries very nervous.
Most scientists believe that now is the time to set global restrictions and parameters regarding the creation and usage of such technology.
In many cases, international law simply hasn't caught up with advances in weapons technology. While so-called "traditional" weapons of mass destruction are outlawed under existing international agreements, our legal systems haven't moved as fast as our ability to build newer, more advanced killing machines.
When you couple this legislative gap with the fact that weapons can now be built that function, and arguably even think, for themselves, the gulf between theory and reality becomes even more glaring.
It's not surprising, however, that some of the world's leading LAWS-developing nations oppose treaties that would limit the development or use of this kind of technology. Only since rival nations began perfecting systems of their own have murmurs of regulation started to circulate.
The UK, the US and Israel are among the world’s most technologically advanced, so it’s not surprising that they lead the pack when it comes to these new-age weapons. They have collectively argued that current international laws are more than adequate to prevent technological abuse. Many ethicists, however, disagree.
Developers, scientists, professors and legal experts without a political stake agree that this technology opens the door to issues unlike any the world has known. First and foremost, they worry about accountability. Who do you hold accountable for the crimes of an autonomous machine? The creator? The programmer? The government behind it? The machine itself?
While Russia is developing humanoid robots capable of independent function, and even independent killing, drones and other machines that use this technology already exist and are arguably already in use.
The bottom line is that we are entering a new technological era, one in which our machines may end up more intelligent and capable than the people operating them. To most who are paying attention, that is a recipe for disaster if it is not closely controlled.