If you happen to keep up with the latest trends in robotics and industrial automation, then terms like teleoperation, telepresence, and teleoperated robots probably sound familiar. Teleoperated robots are remotely controlled robots; they may have some degree of artificial intelligence, but normally they take their commands from a human operator and execute them exactly as instructed.
Right now, teleoperated robots are mostly used in medical surgery and military operations. Critical surgeries are made easier with teleoperated robotic arms or tools because they can reach the tightest places where human hands can't operate. In military operations, teleoperated robots help gather intel and perform dangerous tasks like defusing or moving explosives. Until recently, these robots were controlled with a joystick-style setup or console-like controllers, similar to what you have on a PlayStation, Xbox, or Wii. With advances in virtual reality and augmented reality technology, teleoperated robots are entering a new era: VR- and AR-controlled teleoperation.
Primitive versions of virtual reality and augmented reality emerged long ago, but modern smartphone apps and standalone consumer hardware have made VR and AR available to just about anyone. Recently, MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) built a VR-based controller for teleoperated robots using an Oculus Rift headset. Oculus, acquired by Facebook in 2014, has become one of the industry leaders in AR/VR technology.
The prototype system that MIT's CSAIL created works by receiving input from various sensors placed around a room. These sensors generate environment data for better operation of the robot. Wearing a VR headset, the human operator sees through the robot's eyes and makes movements that the robot mimics. MIT created two separate models for interacting with the robot: the direct model and the cyber-physical model.
In the direct model, the user sees what the robot sees through the VR headset. This method provides a stronger sense of embodiment, as the operator feels as though they are inside the robot. However, the latency between a VR controller and the robot is still significant, and the lag can cause nausea for the operator.
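To make the direct model concrete, here is a minimal Python sketch of one control tick. The `HeadsetLink` and `RobotLink` classes are hypothetical stand-ins (a real system would talk to the Oculus SDK and the robot's own network interface); the point is that every command pays the full round-trip delay, which is exactly the lag that can make operators queasy.

```python
import time
from dataclasses import dataclass

@dataclass
class Pose:
    """Operator head/hand pose: position in metres, yaw in radians."""
    x: float
    y: float
    yaw: float

class HeadsetLink:
    """Hypothetical stand-in for headset tracking (e.g. the Oculus SDK)."""
    def read_pose(self) -> Pose:
        return Pose(0.0, 0.0, 0.1)  # dummy tracking sample

class RobotLink:
    """Hypothetical stand-in for the robot's network interface."""
    def send_pose(self, pose: Pose) -> None:
        time.sleep(0.05)  # simulate ~50 ms of network + actuation delay

def direct_teleop_step(headset: HeadsetLink, robot: RobotLink) -> float:
    """One tick of the direct model: read the operator's pose, mirror it
    on the robot, and report the round-trip latency the operator feels."""
    start = time.monotonic()
    robot.send_pose(headset.read_pose())
    return (time.monotonic() - start) * 1000.0  # latency in milliseconds

if __name__ == "__main__":
    latency_ms = direct_teleop_step(HeadsetLink(), RobotLink())
    print(f"round-trip latency: {latency_ms:.1f} ms")
```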
The other model is the cyber-physical model. The user works in a virtual copy of the robot and the environment it operates in, and the user's interactions with the virtual copy are replicated by the actual robot. Because the robot and the human operator are decoupled, minor lags don't affect the operator. However, this model requires more data and a dedicated space built solely for the cyber-physical simulation.
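Here is an equally simple sketch of the cyber-physical idea, again with hypothetical names: the operator's commands go into a queue, the virtual twin updates instantly, and the real robot drains the queue at its own pace, so network lag never stalls the operator's view.

```python
import queue
import threading
import time

# Pose commands are plain (x, y, yaw) tuples in this sketch.
commands: "queue.Queue[tuple]" = queue.Queue()

def operator_loop() -> None:
    """The operator manipulates the virtual twin, which responds instantly."""
    for step in range(5):
        pose = (step * 0.1, 0.0, 0.0)
        print(f"virtual twin moves to {pose} immediately")
        commands.put(pose)   # queued for the real robot
        time.sleep(0.02)     # the operator keeps working, never waiting

def robot_loop() -> None:
    """The real robot catches up whenever it can; its delay is invisible
    to the operator because the two loops are decoupled."""
    for _ in range(5):
        pose = commands.get()
        time.sleep(0.1)      # simulated actuation delay
        print(f"real robot reaches  {pose}")

robot_thread = threading.Thread(target=robot_loop)
robot_thread.start()
operator_loop()
robot_thread.join()
```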
In the consumer market, VR and AR serve different applications, but in industrial automation they go hand in hand. In the MIT CSAIL project, the Oculus headset and controllers let the robot move in a specific direction, grip factory tools or materials, and place or retrieve items. Both the human's and the robot's movements are mapped into a shared virtual space, and every action and its reaction is buffered there. This virtual co-location, combined with higher bandwidth in the buffer, allows near lag-free interaction between the human operator and the robot down on the production line.
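A rough illustration of that shared mapping, assuming a hypothetical fixed calibration between the operator's tracking frame and the robot's base frame: once both live in one virtual frame, turning a hand motion into a gripper target is just a change of coordinates.

```python
import numpy as np

def make_transform(yaw_rad: float, offset_xyz) -> np.ndarray:
    """4x4 homogeneous transform: rotate about z by yaw, then translate."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = offset_xyz
    return T

# Hypothetical calibration: the robot base sits 1.5 m ahead of the
# operator's tracking origin and is rotated 90 degrees.
OPERATOR_TO_ROBOT = make_transform(np.pi / 2, [1.5, 0.0, 0.0])

def hand_to_gripper(hand_xyz) -> np.ndarray:
    """Map an operator hand position (headset frame, metres) to the
    target gripper position in the robot's frame."""
    hand_h = np.append(hand_xyz, 1.0)        # homogeneous coordinates
    return (OPERATOR_TO_ROBOT @ hand_h)[:3]

print(hand_to_gripper([0.2, 0.1, 0.9]))  # -> [1.4  0.2  0.9]
```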
Augmented reality is incorporated within this virtual space. Using AR, an operator can place virtual objects on real ones, superimpose images in real time, take measurements in 2D and 3D space, build a 3D model of something in the virtual space, perform calculations, and transmit all the gathered data. Before initiating a task in real space, AR can project the likely outcome in the virtual space.
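The superimposition step itself boils down to projecting 3D anchor points into the camera image. Below is a minimal sketch assuming an ideal pinhole camera with made-up intrinsics; nothing here comes from the MIT system, it just shows how a headset decides where to draw a virtual object over the real scene.

```python
import numpy as np

# Hypothetical camera intrinsics for a 1280x720 feed:
# focal lengths fx, fy and image centre (cx, cy) in pixels.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])

def project(point_cam: np.ndarray) -> tuple:
    """Project a 3D point in the camera frame (metres) to pixel (u, v)."""
    uvw = K @ point_cam
    return (uvw[0] / uvw[2], uvw[1] / uvw[2])

# A virtual marker 2 m in front of the camera, slightly right and up.
print(project(np.array([0.25, -0.10, 2.0])))  # -> (740.0, 320.0)
```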
Beyond manufacturing, hospitals are a prime example of where VR and AR could offer the most precise solutions. Crushing kidney stones or replacing a heart valve is more precise and safer when the surgeon can control a robotic arm through VR, while using AR to map out the body or superimpose X-rays and MRIs directly onto it for greater accuracy during procedures.
For industrial automation, VR and AR control of teleoperated robots is still in the developmental phase. Universities like MIT are working on better prototypes that can be quickly adopted to optimize specific tasks, but few industries have adopted VR and AR to control their teleoperated robots. Boeing and Ford are among the first to embrace AR and VR, using Google Glass and the Microsoft HoloLens to assist in the construction and design of their airplanes and cars. I expect many large manufacturers to follow suit soon, making AR and VR a standard tool by 2020.