Researchers Teaching Robots To Disobey Human Instructions; Bug-Zapping Robot Helps 8-Year-Old; New Drone Regulations; Robot Tractors... and much more. Find out what's happening in the robotics universe this week. We select news that will certainly interest or amuse you. Enjoy your reading!
Robots are often thought of as mere slaves designed to obey our orders without question. The very word "robot" is derived from the Czech robota, meaning forced labor.
Huge industrial robots that operate at a safe distance from humans and other equipment in our factories are a good example of the traditional 'Human Overlord-Robot Slave' relationship. Such robots are fed strict instructions in the form of computer code and then proceed to carry them out without question.
But as humans and robots start working in ever closer proximity and robots enter our homes, the traditional relationship becomes a lot more problematic.
For a start, the environments that robots are operating in are becoming more complicated than a simple 'robot in a cage.'
Further, many domestic robot designs are based on robots receiving their instructions by voice rather than through computer code, which increases the range of potential mishaps that could occur.
With all this in mind, it's clear that a robot that followed human instructions without question every single time would be a bad idea. The most obvious example is an instruction that could cause the robot to damage itself or other property.
And let's face it, domestic robots are likely to receive all sorts of orders with varying levels of common sense and good intentions attached:
"I dropped my smartphone in the swimming pool. Can you get it for me?"
An instruction like that is only going to work well with waterproofed robots, but that won't stop some people from giving such orders, either without thinking or simply to see what happens next.
Put simply, robots need to have enough intelligence and awareness of their surroundings to enable them to distinguish between sensible and foolish instructions.
This is what a team of researchers at Tufts University is hoping to achieve by developing mechanisms that enable a robot to reject orders it receives from humans, as long as there is a good reason for doing so:
Now, let’s talk about how all of this stuff works in practice, in real interactions between humans and robots. The overall goal here is not just to teach robots when they should (and should not) follow orders, but also to provide a framework within which the robot is able to effectively communicate why it rejected an order. This is important, because it allows the human to provide additional instructions that might satisfy whichever felicity condition caused the failure in the first place.
[...] The second and third laws of robotics get switched here, since the robot ignores orders from a human when those orders would lead to it harming itself.
Are we setting a dangerous precedent that could doom humanity? Sure, maybe. But it’s not realistic to expect that robots will ever unquestionably obey the laws of all humans they come in contact with: if we tried to do that with computers and software, it would lead to an enormous and destructive mess, and the present and future of robotics is no different.
Read the full paper here.
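To make the idea concrete, here is a minimal, purely illustrative sketch of how a robot's dialogue system might vet a spoken command against a list of "felicity conditions" and report back exactly which one failed, so the human can supply the missing information. The condition wording, the Command class, and the handle_command function are my own assumptions for illustration, not the Tufts team's actual implementation.

```python
# Illustrative sketch only -- not the Tufts researchers' code.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Command:
    action: str     # e.g. "walk_forward"
    issued_by: str  # who gave the order
    context: dict   # whatever the robot currently believes about the world


# Each felicity condition pairs a spoken reason-for-refusal with a predicate
# that must hold before the robot will carry out the order.
FelicityCheck = tuple[str, Callable[[Command], bool]]

FELICITY_CONDITIONS: list[FelicityCheck] = [
    ("I don't know how to do that.",
     lambda cmd: cmd.action in cmd.context.get("known_actions", [])),
    ("My battery is too low to do that right now.",
     lambda cmd: cmd.context.get("battery_ok", True)),
    ("That would cause harm, and you haven't told me how to avoid it.",
     lambda cmd: not cmd.context.get("would_cause_harm", False)
     or cmd.context.get("harm_addressed", False)),
]


def handle_command(cmd: Command) -> str:
    """Carry out the order, or explain which felicity condition failed."""
    for reason, is_satisfied in FELICITY_CONDITIONS:
        if not is_satisfied(cmd):
            # Naming the failed condition lets the human reply with
            # information that satisfies it, instead of hitting a dead end.
            return f"Sorry, I can't: {reason}"
    return f"OK, doing '{cmd.action}'."


if __name__ == "__main__":
    cmd = Command(
        action="walk_forward",
        issued_by="operator",
        context={"known_actions": ["walk_forward"],
                 "battery_ok": True,
                 "would_cause_harm": True},   # e.g. walking off a table edge
    )
    print(handle_command(cmd))            # rejected, with the reason
    cmd.context["harm_addressed"] = True  # the human says "I will catch you"
    print(handle_command(cmd))            # now the robot complies
```

The key design point is the return value: instead of a bare refusal, the robot names the failed condition, which is what lets the human answer with something like "I will catch you" and have the order accepted on the second attempt.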
There may be potential for robot disobedience beyond simply ensuring robot safety.
Maybe it's just me, but I think I would quite enjoy having a domestic robot that not only questioned my instructions when safety issues arose, but also outright contradicted me on a regular basis for no particular reason.
Why? Think brain-training device rather than robo-butler. Such a robot would keep me on my toes... hopefully without our relationship descending to Inspector Clouseau-Cato type levels:
The Federal Aviation Administration has released a list of recommendations (PDF) for how to better monitor recreational use of drones in the United States:
The group’s final recommendations include creating an electronic registration system accessible through the Web or an app, providing an electronic certificate of registration and a personal registration number that can be used on all of an owner’s UAS, and requiring that the registration number (or registered serial number) be marked on all UAS before they are flown.
The task force recommended that any drone with a maximum takeoff weight of 250 grams (just under 9 ounces) or more be registered. But several task force members suggested that a heavier weight might be more appropriate. AOPA and others noted that very little research has been done on the effects of a small drone strike, making it difficult to offer a well-reasoned recommendation as to what size drone poses a meaningful threat. They urged the FAA to expedite its research in that area and review the weight requirement for registration based on its findings.
Under the task force recommendations, drone owners would need to provide their name and address in order to receive a registration number. Providing email addresses and phone numbers would be optional.
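For a sense of how lightweight the recommended scheme is, here is a hypothetical sketch of the data involved. Only the 250-gram threshold, the required name and address, the optional email and phone, and the one-registration-number-per-owner rule come from the recommendations above; the field names, the "FAA-" number format, and the helper functions are my own illustrative assumptions.

```python
# Hypothetical sketch of the recommended registration model -- not an FAA system.
from dataclasses import dataclass, field
from typing import Optional
import uuid

REGISTRATION_WEIGHT_THRESHOLD_G = 250  # max takeoff weight requiring registration


@dataclass
class Owner:
    name: str                      # required under the recommendations
    address: str                   # required under the recommendations
    email: Optional[str] = None    # optional
    phone: Optional[str] = None    # optional
    # One number per owner, usable across all of that owner's UAS.
    # The "FAA-" format is an assumption for illustration.
    registration_number: str = field(
        default_factory=lambda: f"FAA-{uuid.uuid4().hex[:8].upper()}")


def requires_registration(max_takeoff_weight_g: float) -> bool:
    """Drones at or above the recommended 250 g threshold must be registered."""
    return max_takeoff_weight_g >= REGISTRATION_WEIGHT_THRESHOLD_G


def marking_for(owner: Owner) -> str:
    """The owner's single registration number, to be marked on each UAS
    before it is flown."""
    return owner.registration_number


if __name__ == "__main__":
    owner = Owner(name="Jane Doe", address="123 Example St, Anytown, USA")
    print(requires_registration(1280))   # True  -- a typical camera quadcopter
    print(requires_registration(180))    # False -- a small toy drone
    print(marking_for(owner))
```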
See also: 3 Problems with the Drone Registration Recommendations (RoboticsTrends)
Do you have an idea for a robot you think DARPA would be interested in? Now's your chance to let them know:
DARPA has teamed up with the Open Source Robotics Foundation to launch a program called the Robotics Fast Track (RFT). The goal is to give people outside the government a chance to pitch ideas for the future of robotics. The RFT is happy to accept any robotics hardware or software proposal, but it's particularly interested in what people have in mind for maritime and space robots, which are considered among the most challenging areas of modern robotics.
Xenex's bug-zapping robot has come to the aid of Aydan Chapman, an 8-year-old Texan boy who picked up a superbug infection while receiving chemotherapy for brain cancer:
1.7 million Americans will pick up infections at the hospital this year, often antibiotic-resistant superbugs that are hard to disinfect and can potentially be fatal. That has apparently happened to an eight-year-old Westlake boy. During the time he was receiving chemotherapy for his brain cancer, he got C. diff, a superbug that kills one in ten people who get it and lingers on surfaces for months. But Aydan Chapman is safely home now, his entire house disinfected by a superbug-zapping robot.
Interview: Roman Yampolskiy Discusses AI, Killer Robots, and the Age of the Machine (Futurism)
How swarm intelligence could save us from the dangers of AI (VentureBeat)
Matt Reimer, a farmer in rural Manitoba, has built a robotic tractor from open-source components that can pull up alongside his combine so that harvested grain can be transported, all without a driver. (Audio: 9m14s)
The Drone Racing League Will Be a Spectator Sport Like No Other (Wired)
Robotic Surgery For Sleep Apnea (ABC)
Artificial Intelligence: 10 Things To Know (InformationWeek)
Robot to help passengers find their way at airport (Alphagalileo)
5 Places To Meet a Robot in Tokyo (Japan Today)
Flying Robots Are the Future of Solar (GreenTechMedia)
Robotic agriculture and swarming with bees (ABC)
China’s robot sector needs to pick up the pace to upgrade manufacturing and rival foreign competitors, say experts (South China Morning Post)