
Issue 127: The 3 Robot Laws (July 24, 2013)


Throughout science fiction, a standard has developed around the 3 laws that would govern intelligent and autonomous robots. Let us go over them and explore their faults, if any.

History: Before stating what the 3 laws are, it is important to know their origin. They come from one of the most famous science fiction authors, Isaac Asimov, in his 1942 short story "Runaround" (source: "Time," "Rise of the Robots"). Asimov went on to write many more books throughout his life, including non-fiction works on physics.

Law 1: "A robot may not injure a human being or, through inaction, allow a human being to come to harm": Basically, a robot may not harm a person and must prevent them from getting hurt when and where it can. Problem: does this mean robots will try to stop us from bungee jumping? Could a robot sabotage a rocket ship to the moon, preventing us from exploring, because its programming deems the trip too dangerous? This first law needs clarification with respect to a robot's overall program. In essence, every scenario would need to be programmed with safety in mind so that the robot knows when to protect us and when not to interfere. The second law attempts to mitigate this problem, but I think it does so unsuccessfully.

Law 2: "A robot must obey the orders given to it by human beings, except where such orders would conflict with the first law": Notice that we can order the robot around like a servant, with the exception that the order cannot violate the first law. The idea I imagine behind this law, in combination with the first, is that a robot will not be used as a weapon against other people. Problem: a military robot will not have the first law and will be programmed to obey only military personnel from a particular country (remember, we are discussing semi-intelligent robots here). Robots are another piece of tech designed to make tasks easier to do. In the military, a robot built for land, sea, or air is designed to fulfill whatever type of mission falls into one of the three categories of work: "Dull, Dirty, and Dangerous." Thus we have robots doing photo reconnaissance, mine sweeping, bomb disposal, and pre-raid surveillance. In the civilian market, robots do those too, but are breaking into the helpful-aid category by acting as robotic assistants in emergency rooms and in factories. In other words, R2-D2 type devices (minus the cool personality and looks) are beginning to appear.

Law 3: "A robot must protect its own existence as long as such protection does not conflict with the first or second laws": In other words, we can order them to battle each other or to do a hazardous job. This seems to be the only law without any problems, unless their programming rules that their mission is to protect us and that they can't do that if they must beat each other down on our order. Basically, what we program into them is what we get. If you remember the movie "I, Robot" (based on Isaac Asimov's books), the robots take over because of their 3 laws. We could not tell them to stop because they reasoned that removing them from people's lives would put people in jeopardy: we had become dependent on the robots, so to protect us, even from ourselves, they had to control our lives. That is a distinct possibility when the 3 laws operate in combination. Do these laws together make a moral code for robots to follow? Sure. However, I believe Asimov really wrote them for people to follow, since a person can distinguish danger and conflict naturally while a robot cannot. Even with that possibility in mind, both a person and the robot that person programs are fundamentally flawed. Thus if people follow those same laws, they may respond just as the robots in the movie did, protecting us from ourselves any way they can while becoming nothing more than another type of oppressor.
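The combination discussed above boils down to a strict precedence: Law 1 overrides Law 2, which overrides Law 3. As a minimal sketch (all names and flags here are invented for illustration; a real robot would need far richer models of harm and intent), the ordering can be expressed as a lexicographic comparison over candidate actions:

```python
# Hypothetical sketch: the 3 laws as a strict priority ordering.
# Each candidate action carries three flags. Comparing the tuple
# (harms_human, disobeys_order, endangers_self) means a Law 1
# violation outweighs any Law 2 or Law 3 concern, and so on down,
# because False sorts before True in Python.

def choose_action(candidates):
    """candidates: list of (name, harms_human, disobeys_order, endangers_self).

    Returns the name of the action that violates the highest-priority
    law the least.
    """
    return min(candidates, key=lambda c: (c[1], c[2], c[3]))[0]

actions = [
    ("stand by",         False, True,  False),  # ignores the human's order
    ("follow order",     False, False, True),   # risks only the robot itself
    ("harmful shortcut", True,  False, False),  # hurts a bystander
]
print(choose_action(actions))  # -> follow order
```

Note that this sketch reproduces the essay's worry: the robot happily sacrifices itself (Law 3) to obey an order, and the only thing standing between an order and harm is however "harms_human" gets computed, which is exactly the programming problem raised under Law 1.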

Conclusion: The three laws are flawed and are unlikely to be installed in any basic robot, since each task robots are used for varies, and the laws may interfere with the robot's basic function. From the military to the factory, from the bottom of the sea to our households, robots are here to stay, but they will always be only as good as their programming.
