How Fast Is Your Collaborative Robot Getting ‘Smarter’?

by Marc Prosser on Mar 27, 2017 7:00:00 AM

One of the great things about collaborative robots is that they’re very good at learning on the job. Perhaps it’s more correct to say that the people building and programming them are constantly learning how to improve them by observing how the robots perform in factories and other production environments. These insights form the foundation of further development of technologies and systems that boost collaborative robots’ potential.

“The main driver is that robots and their tools are becoming easier to integrate and program. This has a direct effect on what tasks are easy or difficult to hand over to a collaborative robot,” Nicolas Lauzier, R&D Director at Robotiq, says.

To mention a few examples:

  • Robotiq Wrist Camera makes it easy for collaborative robots to locate objects on a plane using a compact camera on the wrist of the robot (a rough sketch of how such a detection becomes a pick pose follows this list).
  • ArtiMinds RPS offers intuitive, easy programming of the force-control applications that are central to many tasks.
  • Pick-it 3D vastly enhances the efficiency of bin-picking applications.
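To make the first point concrete, here is a minimal, hypothetical sketch of the geometry involved: a camera-style detection gives an object's position and rotation on a calibrated work plane, and a 2D rotation plus translation maps it into the robot's base frame. The detection values, the calibration pose and the plane_to_base helper are illustrative placeholders, not Robotiq's actual API.

```python
import math

# Hypothetical detection result from a wrist camera: (x, y) in metres and a yaw
# angle, all expressed in the calibrated work-plane frame.
detected = {"x": 0.12, "y": -0.05, "yaw": math.radians(30)}

# Assumed calibration: pose of the work-plane origin in the robot base frame
# (the plane is assumed horizontal, so only a yaw rotation separates the frames).
PLANE_ORIGIN = {"x": 0.40, "y": -0.20, "z": 0.02, "yaw": math.radians(90)}

def plane_to_base(det, origin):
    """Rotate the planar detection by the plane's yaw, then translate it into the base frame."""
    c, s = math.cos(origin["yaw"]), math.sin(origin["yaw"])
    return {
        "x": origin["x"] + c * det["x"] - s * det["y"],
        "y": origin["y"] + s * det["x"] + c * det["y"],
        "z": origin["z"],                    # the part sits on the plane
        "yaw": origin["yaw"] + det["yaw"],   # gripper rotation about the vertical axis
    }

pick_pose = plane_to_base(detected, PLANE_ORIGIN)
print("Move the gripper to:", pick_pose)
```

Products like the Wrist Camera aim to hide this calibration and frame bookkeeping behind a teach-by-demonstration workflow, so the programmer only teaches the pick itself.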

Illustrating the Easy and the Hard

Towards the end of 2015, Robotiq published an infographic describing which tasks are generally easy – and which are difficult – to hand over to collaborative robots.

In the ‘Easy’ column were tasks such as picking and placing parts, distinguishing between a limited number of known parts, ordering shapes in a tray or a pallet and carrying out repetitive tasks that require the same steps every time.

The ‘Hard’ column was dominated by tasks where things become more varied: for example, grinding with varied force, some kinds of precision assembly, distinguishing between a wide variety of shapes, bin picking different parts, and carrying out tasks requiring complex logic conditions and/or many different kinds of robot sensor input.

Barely a year and a half later, the infographic already serves as a yardstick for how quickly collaborative robots are progressing and evolving.

The Grind Becomes Easier

Sometimes we forget what a marvel of construction our body is. What seem like simple tasks often require highly advanced interaction between manipulators (our arms, for example), end effectors (hands and fingers) and our sensing and processing systems (senses and brain).

Tasks like grinding or polishing painted parts are prime examples. The supple manipulation of pressure, the angling of the polisher and knowing when to move on to a different part might seem straightforward to you and me, but not to a robot.

However, that has changed dramatically over the last couple of years. Thanks to massive advances in fields such as force control, collaborative robots from companies like Universal Robots are now capable of performing most basic forms of grinding and polishing. It should be mentioned that tasks that are very non-repetitive and require a lot of variation in pressure, like polishing car parts, are still solidly in the ‘Hard’ column.
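For readers curious what force control looks like in practice on a Universal Robots arm, here is a rough sketch of one common approach: a short URScript program enables force mode along the tool's Z axis so the robot holds roughly constant contact pressure while a linear move slides across the part. The IP address, poses, 10 N target and limit values are placeholders, and the exact force_mode arguments should be checked against UR's URScript manual for your controller version; this is not Robotiq's or UR's published polishing recipe.

```python
import socket

ROBOT_IP = "192.168.0.100"   # placeholder: address of the UR controller

# Placeholder URScript: comply along tool Z with ~10 N while sliding across the surface.
URSCRIPT = """
def polish_pass():
  force_mode(get_actual_tcp_pose(), [0, 0, 1, 0, 0, 0], [0.0, 0.0, 10.0, 0.0, 0.0, 0.0], 2, [0.1, 0.1, 0.15, 0.35, 0.35, 0.35])
  movel(p[0.40, -0.20, 0.05, 0.0, 3.14, 0.0], a=0.2, v=0.05)  # slide across the part
  end_force_mode()
end
"""

# The UR secondary interface accepts URScript programs as plain text on port 30002.
with socket.create_connection((ROBOT_IP, 30002), timeout=5) as s:
    s.sendall(URSCRIPT.encode("utf-8"))
```

The design point is that the contact force becomes an explicit, programmable parameter, rather than something a fixed trajectory happens to produce.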

Finding the Right Shape

As described in the free Robotiq eBook ‘Robotic Bin Picking Fundamentals’, the ability to sort a variety of unordered shapes has long been considered a holy grail of robotics.

Big advances are now being made thanks to the combination of systems like the Robotiq Wrist Camera and ArtiMinds software. Other robotics companies, such as FANUC, have also shown rapid progress.
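At a high level, all of these systems automate the same loop: scan the bin, pick the best grasp candidate, place the part, repeat. The skeleton below illustrates that flow only; scan_bin, move_and_grasp and place_part are hypothetical stand-ins for a 3D vision system, the motion stack and the gripper, not any vendor's real API.

```python
def scan_bin():
    """Hypothetical: trigger a 3D scan and return grasp candidates, best first."""
    raise NotImplementedError

def move_and_grasp(pose):
    """Hypothetical: approach the pose, close the gripper; return True if the grasp held."""
    raise NotImplementedError

def place_part():
    """Hypothetical: drop the part at a fixture or on a conveyor."""
    raise NotImplementedError

def empty_the_bin(max_cycles=200):
    for _ in range(max_cycles):
        candidates = scan_bin()       # re-scan every cycle: parts shift as the bin empties
        if not candidates:
            return                    # nothing graspable left in view
        for pose in candidates:       # try the best-ranked grasp first, then fall back
            if move_and_grasp(pose):
                place_part()
                break
```

The hard part that vendors are now solving lives inside scan_bin and move_and_grasp: recognising overlapping parts and planning collision-free grasps in a cluttered bin.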

“I would say that random bin picking is already approaching mainstream,” David Dechow, Staff Engineer-Intelligent Robotics/Machine Vision at FANUC America Corporation, recently told Robotics.org.

Intuitive Programming Is Part of the Key

The two areas mentioned above serve as illustrations of just how rapidly collaborative robots are progressing. Similar advances are happening across the board.

Nicolas Lauzier sees intuitive programming structures and the integration of sensors as central to the speed of these advances. They mean that collaborative robots can now be installed and programmed very quickly – sometimes in just one day. In short, he sees robots as smart when they can handle complex applications even when trained by non-expert humans.

“Robots become smart when they use things like force/torque sensors, cameras and software to adapt to their environment. This integration enables them to make more complex decisions,” he says.

“In the last couple of years, we have seen the rise of various technologies aimed at reducing the complexity of integration for applications; these increasingly include plug & play components, easy-to-use offline programming software and self-adapting algorithms, including deep learning. In a certain way, we can say that the robots are becoming ‘smart’ because they easily understand the non-expert human input and transfer this knowledge to execute tasks in a robust and repeatable way.”


Written by Marc Prosser
Marc Prosser is a freelance journalist and editor, currently living in Tokyo, Japan. He is a self-declared geek, and focuses on science, technology and finance. His work has appeared in numerous media, including Singularity Hub, Forbes and the Financial Times.
