Remember the good old days when the next foxhole over contained an actual human being?
"Numbers reflect the explosive growth of robotic systems. The U.S. forces that stormed into Iraq in 2003 had no robots on the ground. There were none in Afghanistan either. Now those two wars are fought with the help of an estimated 12,000 ground-based robots and 7,000 unmanned aerial vehicles (UAVs), the technical term for drone, or robotic aircraft."
The Reuters article goes on to talk about what robots are doing in the battlefield and how they can save human lives, as well as take them.
"Ground-based robots in Iraq have saved hundreds of lives in Iraq, defusing improvised explosive devices, which account for more than 40 percent of U.S. casualties. The first armed robot was deployed in Iraq in 2007 and it is as lethal as its acronym is long: Special Weapons Observation Remote Reconnaissance Direct Action System (SWORDS). Its mounted M249 machinegun can hit a target more than 3,000 feet away with pin-point precision."
So what could possibly go wrong?
"A recent study prepared for the Office of Naval Research by a team from the California Polytechnic State University said that robot ethics had not received the attention it deserved because of a "rush to market" mentality and the "common misconception" that robots will do only what they have been programmed to do.
"Unfortunately, such a belief is sorely outdated, harking back to the time when computers were simpler and their programs could be written and understood by a single person," the study says. "Now programs with millions of lines of code are written by teams of programmers, none of whom knows the entire program; hence, no individual can predict the effect of a given command with absolute certainty since portions of programs may interact in unexpected, untested ways."
That's what might have happened during an exercise in South Africa in 2007, when a robot anti-aircraft gun sprayed hundreds of rounds of cannon shell around its position, killing nine soldiers and injuring 14."
The article raises the question of ethics the way any modern one-page story does: in the kind of incomplete thoughts scattered across the Internet nowadays. (The irony is, I'm blogging this.)
"Beyond isolated accidents, there are deeper problems that have yet to be solved. How do you get a robot to tell an insurgent from an innocent? Can you program the Laws of War and the Rules of Engagement into a robot? Can you imbue a robot with his country's culture? If something goes wrong, resulting in the death of civilians, who will be held responsible?
The robot's manufacturer? The designers? Software programmers? The commanding officer in whose unit the robot operates? Or the U.S. president who in some cases authorises attacks? (Barack Obama has given the green light to a string of Predator strikes into Pakistan)."
No need to panic yet; it's not like they would deploy this technology at home. Right?
Just get nervous if anyone mentions Skynet.
In closing, I leave you with a YouTube clip of the South Korean guard robot. And yes, it's real.