
How a Collaborative Robot Uses Super Powers to ‘See’ You

by Marc Prosser on Nov 15, 2016 7:00:00 AM

Chances are that one of your future colleagues will be a collaborative robot. So it might be good to know how such a robot senses its surroundings – and you, its human colleague.


The reason why more robots are being integrated into factories is simple maths. More and more companies are doing the calculations and finding that collaborative robots have a very short ROI and add great value to the company. Not only do they drive up efficiency, but also employee happiness, by taking on boring, repetitive tasks.

Collaborative robots are cheaper than their cousins in the industrial robot space, and they reach almost plug-and-play levels of user friendliness. They are well and truly within the economic and technical reach of small and medium-sized enterprises.

While your classic flesh-and-blood colleagues rely on the five senses that are the standard specifications of human beings, robots use a number of different ways to sense and interpret their environment. This includes how they ‘see’ human beings. It involves the likes of cameras and force sensors, as well as senses that would normally be associated with the realm of superheroes.

Together, they help determine how a collaborative robot operates in open industrial environments and interacts with human beings.

“We see a lot of rapid progress both from the sensors and from the perception software. In both cases, we will soon reach a price-performance point that will make these add-ons part of many collaborative robot installations. This will give the robot the ability to do more things by itself and improve how it interacts with its co-workers,” Mathieu Bélanger-Barrette, production engineer at Robotiq, says.

Perhaps the best way of presenting how a collaborative robot sees its surroundings – and you – is to compare its senses to a couple of our human ones, as well as those from the realm of superheroes.

Sight

Humans are a race of twos. We are bipedal, have two manipulators with preinstalled end effectors, two audio receptors, two smell sensors – and our vision is binocular.

The specs for the latter (pun intended) are pretty impressive. The same goes for collaborative robots, though.

Robots are often equipped with a single visual sensor. Some, such as Rethink Robotics’ Baxter and ABB’s YuMi, come equipped with a camera, while others do not.


A robot uses its camera – or cameras – to generate either 2D or 3D images.

2D cameras have traditionally been mounted in stationary positions and register what comes into their field of view. Imagine a camera that scans a conveyor belt for shapes that it recognizes. Once the camera finds such a shape, it can trigger different actions from the robot, such as picking up the object.
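That watch-and-trigger pattern can be sketched in a few lines of Python. This is an illustration only: shapes are reduced to plain names, where a real vision system would do actual image processing, and `scan_frames` and `KNOWN_SHAPES` are made-up names, not part of any vendor API.

```python
# Sketch of the "camera watches a conveyor" trigger pattern.
# Each frame is modelled as the list of shape names the camera
# recognized in its field of view during that capture.

KNOWN_SHAPES = {"gear", "bracket"}

def scan_frames(frames):
    """Return the pick actions triggered while scanning camera frames."""
    actions = []
    for frame in frames:
        for shape in frame:
            if shape in KNOWN_SHAPES:
                # A recognized shape triggers a robot action.
                actions.append(f"pick {shape}")
    return actions
```

Running it over a stream of frames, only the shapes the system was taught to recognize lead to a pick; everything else passes by untouched.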

3D camera setups – which are also usually stationary – add depth to what a robot sees. This enables the robot to perform tasks such as sorting through a pile of different objects.
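Depth is what makes pile-sorting tractable: with height information the robot can always pick the topmost object first instead of colliding with the pile. A minimal sketch, with objects reduced to hypothetical (name, height) pairs rather than real point-cloud data:

```python
# Sketch of depth-aware bin picking: pick the topmost object first.
# Objects are modelled as (name, height_in_metres) pairs; a real
# system would derive heights from a 3D camera's depth map.

def pick_order(objects):
    """Return object names sorted topmost-first by measured height."""
    return [name for name, height in
            sorted(objects, key=lambda o: o[1], reverse=True)]
```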

There is also a third approach: placing a 2D camera on the arm of the robot itself. This allows the robot’s field of vision to travel wherever the arm does.

Robotiq recently launched the Robotiq UR+Camera, which lets companies add a Robotiq Wrist Camera to a Universal Robots arm. This allows you to easily include a 2D smart camera with an integrated light source in your UR robot.

Jumping into the realm of super powers, some collaborative robots also use the likes of lasers and infrared sensors. While some of these help a robot perform work tasks, they are usually safety features.

Collaborative robots can use lasers and infrared sensors to detect a presence around them. That presence is most often you, their human colleague. This is often a 360-degree feature, so it gives the robot the equivalent of eyes in the back of its head.

These systems can be configured to slow down or even stop a robot once a worker enters a certain area.

This is referred to as speed and separation monitoring. It is a subject that is covered in depth in a number of articles on the Robotiq blog and in the company’s free eBooks.
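The core idea of speed and separation monitoring is easy to sketch: the robot's allowed speed is a function of how close the nearest detected human is. The zone distances and speed factors below are illustrative numbers, not values from any safety standard – real limits come from a risk assessment.

```python
# Sketch of speed-and-separation monitoring: the nearest detected
# human's distance maps to an allowed speed factor. All thresholds
# here are illustrative, not values from a safety standard.

STOP_ZONE_M = 0.5       # human closer than this: robot stops
SLOW_ZONE_M = 1.5       # human closer than this: robot slows down
FULL_SPEED = 1.0        # fraction of the programmed speed
REDUCED_SPEED = 0.25

def allowed_speed(nearest_human_distance_m):
    """Map the nearest detected human's distance to a speed factor."""
    if nearest_human_distance_m < STOP_ZONE_M:
        return 0.0              # protective stop
    if nearest_human_distance_m < SLOW_ZONE_M:
        return REDUCED_SPEED    # collaborative slow mode
    return FULL_SPEED           # nobody nearby: full speed
```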


Feel

‘Use the force, Luke.’

Exchange ‘Luke’ for ‘collaborative robot’ and we are dealing with one of the most important robot senses – both for robot/human interaction and for what a collaborative robot can do.

The ANSI/RIA R15.06-2012 standard for collaborative robots defines a number of safety requirements that ensure that robots and humans can interact without risk of injury. For example, it states how fast a robot should move and how much resistance should cause it to stop. The resistance recommendations are based on force, expressed in newtons.

Imagine that you wander into the reach of the arm of a collaborative robot as it is working. It moves towards you, and accidentally pushes into you. It registers – or ‘feels’ – that it has encountered something that it could risk damaging – or being damaged by – and stops. Since it was moving slowly, you receive no more than a soft push from the robot.
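That protective-stop behaviour boils down to a simple rule: command zero speed the moment the measured contact force crosses a limit. The 150 N figure below is purely illustrative – an actual limit would come from the applicable standard and a risk assessment.

```python
# Sketch of a force-based protective stop. The force limit is an
# illustrative number, not one taken from any safety standard.

FORCE_LIMIT_N = 150.0

def motion_command(measured_force_n, requested_speed):
    """Return the speed to command: zero once the force limit is hit."""
    if measured_force_n >= FORCE_LIMIT_N:
        return 0.0              # protective stop: contact detected
    return requested_speed      # free motion: carry on as requested
```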

Force is also a deciding factor in regards to which tasks a collaborative robot is capable of performing.

When we humans pick up an object, we use force feedback to determine whether the object is soft or hard. We also use it to decide how firmly to hold the object.

Traditionally, the lack of force and tactile sensing in end effectors has limited the use case scenarios for collaborative robots.

However, this has changed in recent years.

For example, Robotiq offers a number of Force Torque Sensors. These plug-and-play add-ons to Universal Robots arms instantly give a collaborative robot the ability to pick up and manipulate fragile parts that previous generations of robots would likely have damaged.

A force torque sensor allows your robot to perform tasks such as precision part assembly and product testing, and to take on new parts of the production process.

“With a force sensor, it is much easier to program a robot to insert a part into another, for example, knowing when it reaches the bottom of the host part. It is also easier to perform polishing or grinding, as it allows the robot to apply a constant force on the object it's polishing or grinding,” Mathieu Bélanger-Barrette says.
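The two uses in the quote can be sketched with made-up numbers: insertion ends when the part presses against the bottom of its host, and polishing holds a constant force by nudging the tool toward or away from the surface. The thresholds, gain, and function names here are all illustrative assumptions, not Robotiq or Universal Robots APIs.

```python
# Sketch of two force-sensor uses: bottom detection during insertion,
# and a proportional correction that holds a constant polishing force.
# All numbers are illustrative.

CONTACT_FORCE_N = 20.0   # force that signals "reached the bottom"

def insertion_done(measured_force_n):
    """Insertion is complete once the part presses against the bottom."""
    return measured_force_n >= CONTACT_FORCE_N

def polishing_correction(measured_force_n, target_force_n=10.0, gain=0.001):
    """Return a small position adjustment (in metres) toward the target
    contact force: press in when the measured force is too low, back
    off when it is too high."""
    return gain * (target_force_n - measured_force_n)
```

The proportional correction is the simplest possible force controller; a production system would add filtering and limits, but the principle – steer the position error with the force error – is the same.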

 


Written by Marc Prosser
Marc Prosser is a freelance journalist and editor, currently living in Tokyo, Japan. He is a self-declared geek, and focuses on science, technology and finance. His work has appeared in numerous media, including Singularity Hub, Forbes and the Financial Times.
