The Cobot Experience: Elizabeth Croft & The Rules of the Handover
Posted on Aug 10, 2019 8:58 AM. 8 min read time
World-renowned human-robot interaction expert Elizabeth Croft talks human-centered design, the rules of the handover and why society must embrace cobot technology.
Credit: University of British Columbia
Elizabeth Croft, Dean of Engineering at Australia's Monash University, is one of the world's leading experts in the field of human-robot interaction. Over more than two decades, Croft has led large-scale collaborative robotics research projects in manufacturing and has guided multidisciplinary initiatives with General Motors, the German Aerospace Centre and other industry partners.
Previously working at the University of British Columbia, Canada, Croft was principal investigator for the CARIS (Collaborative Advanced Robotics and Intelligent Systems) lab.
Croft kindly agreed to be interviewed about her work and the place of cobots in manufacturing and society.
What are the questions that inspire and drive your research on human-robot interaction?
Human-robot interaction (HRI) is a wide-open, green field of inquiry and cobots present an open opportunity for creativity and human-centered design. It's psychology meets engineering and AI. It's a fun area, but it's also important that we engage and think about HRI seriously, because it's going to have a huge impact on the future of work.
So, I ask: 'What are the key things that we need to be able to do to enable effective human-robot collaboration?'
We need to be able to understand what each of the actors is doing. We need to be able to take turns. We need to be able to hand things back and forth between human and robot and we need to have a shared understanding of what the common goal of the collaboration is.
We also need to understand who is in charge at various points in the collaboration path. I want to try and tease apart some of these different collaboration problems like handover and turn taking and to look at these issues more deeply.
This involves trying to answer the question: 'What are the rules of engagement that allow shared spaces and collaboration between humans and robots to happen in a safe, flexible and efficient manner?'
Can you expand on your work around handovers between humans and robots?
One of the questions we asked in the CARIS lab is 'How do we have a handover occur where it's not a tug-of-war and it's not so mechanical that it feels unnatural?' The aim is to achieve handovers that are just as simple as me handing you a cup.
So, we started by studying human-to-human handovers. We used a baton instrumented with a force-torque sensor, and motion tracking to capture people's movements. People who did not know each other were able to pass the baton back and forth without any problems.
We measured all that data. We looked at the position data, the grip data on the baton and at who was bearing the load. When we broke that down we were able to work out the rules of engagement for human-to-human handovers.
Wow! So, what are the rules?
First, the person handing over the object is responsible for the safety of that object, including making sure that it doesn't fall. We found that the person doing the handover should not let go until the other person has fully taken the weight of the object. So, the person doing the handover is responsible for ensuring a safe handover.
Second, we found that the person receiving the object is responsible for how fast the handover occurs. The faster they bear the load the faster the person that is letting go will let go. So, the person doing the handover is responsible for safety and the person receiving the object is responsible for timing.
We then encoded those rules as a control algorithm into a robot, which created a really nice, smooth handover algorithm that is so natural, people don't even have to learn how to do it and can immediately start using this interface.
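The two rules can be illustrated with a minimal sketch. This is not the CARIS lab's actual controller; it is a hypothetical illustration assuming the robot is the giver and can sense the load the receiver is currently bearing. All names, values, and the linear release profile are assumptions for the example.

```python
# Minimal sketch of the two handover rules (hypothetical, not the
# CARIS lab's published controller).

OBJECT_WEIGHT_N = 5.0   # assumed object weight in newtons
MAX_GRIP_N = 20.0       # assumed full grip force of the giver

def giver_grip_force(receiver_load_n: float) -> float:
    """Rule 1: the giver remains responsible for the object's safety.

    Grip is released only in proportion to the load the receiver has
    already taken, and reaches zero only once the receiver bears the
    full weight, so the object cannot be dropped mid-transfer.
    """
    load_fraction = min(max(receiver_load_n / OBJECT_WEIGHT_N, 0.0), 1.0)
    return MAX_GRIP_N * (1.0 - load_fraction)

def simulate_handover(receiver_loads):
    """Rule 2: the receiver sets the pace.

    The faster the receiver's measured load rises toward the object's
    weight, the faster the giver's grip goes to zero.
    """
    return [giver_grip_force(load) for load in receiver_loads]

# A slow receiver keeps the giver gripping longer than a fast one;
# both transfers end with the giver fully releasing the object.
slow = simulate_handover([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
fast = simulate_handover([0.0, 2.5, 5.0])
```

Because the release is driven purely by the sensed load transfer, the same loop yields a slow or fast handover without the robot needing an explicit speed parameter, which matches the observation that the receiver controls timing.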
See the CARIS lab's 'human-inspired object handover controller' in action in this video...
The really important thing that we learned from that experiment is that if you're going to have robots interact with people, especially in building-block-type collaboration, you really have to understand how people interact naturally and the social contracts that control activities such as handovers.
Are you building the robot around the person?
I'm building the robot around expectable behaviors and the application. People construct meaning from the physicality and behavior of robots. That meaning should align with the social contracts that we already have.
Robots have to be designed with the human and their work at the center rather than the robot at the center. It has to be designed around what the task is and what the person needs to get done, because what we are really trying to do with cobots and robot assistants is to help make the work experience better.
If cobots don't succeed in making work more enjoyable and more fulfilling, people are not going to use the technology.
A source of failure with many robotics projects is that we design these fantastic robots but people don't like them or don't find them useful, or they cause problems because the designers haven't thought in terms of human-centered design. For me, it always starts by understanding how people work and how people work with other actors and then designing the interactions around that initial learning.
Do you think current cobot makers are doing a good job of creating effective interactions between humans and cobots?
I think about [the dual-arm cobot] Baxter, which we all love and which is a fantastic robot assistant with a fantastic story. We thought it was going to be all wonderful and helpful. The primary source of failure was Baxter's imprecise elastic actuators, but there was a deeper reason for that failure: not understanding what it is that people were trying to achieve with the cobot and how they were going to use it.
Baxter was designed to be really safe and people were comfortable using it, but it wasn't accurate enough to meet expectations. CGI is so good these days that people's expectations of what robots should be doing are not what robots are actually capable of. This really pulled people away from being able to use that kind of cobot.
The most successful cobots are super simple. For example, the KIVA bots in Amazon warehouses drive around, pick up payloads and transport them. Their interactions with people are very simple too. They just drive to where the person is.
Similarly, Universal Robots' cobots are very nice, simple to use and people are able to set them up very quickly.
You use the terms 'robot assistant' and 'cobot'. Is there a difference?
For me, a cobot is where you are co-lifting or co-manipulating something together and there is a prolonged physical connection between human and robot.
Whereas if I think of a robot assistant, there may not necessarily be a physical connection. If there is, it may be on and off like a handover or another task that involves turn taking.
So, while a cobot involves a human and a robot working together on a task, a robot assistant might have its own tasks. In that sense, it is just as much an independent actor as the human. That's my classification between cobotics and robotic assistants.
Where do you see cobots like Universal Robots' robot arm fit into that?
When it's joint lifting I would say that it's a cobot. But when you combine the robot with vision and algorithms to make it an independent actor then I think it fits more into the robot assistant category.
Are end-users generally best served by viewing their cobot as a colleague, a tool, a form of prosthesis, or some other category?
It depends on the user, the task and the mental construct that you are using to be able to help you get the job done.
For example, if you are responsible for control of the task, then the cobot is a tool. This is the best mental model if you're working with a cobot that provides the lifting, for example, but you make all the decisions about direction and speed.
But if you're using a cobot to bring you stuff and perform tasks side-by-side with you, while you perform your own tasks, then I think it's useful to think in terms of leader and follower.
Whether a cobot is thought of as a colleague or a tool comes around to how much you understand about the independence and intent of the robot. If your way of using the robot is to control everything that happens, then it's a tool.
Elizabeth Croft received the R.A. McLachlan Memorial Award from Engineers and Geoscientists BC in 2018. This video offers a brief overview of Croft's extensive research into human-robot interaction across many environments from cobots in manufacturing to last-mile delivery bots.
Do you see cobots as a technology that could help mitigate job losses through automation by enhancing rather than replacing human labour?
I absolutely see that. Societies that have spent a lot of money on education and that provide high working wages need to make sure that they offer people high-value, rewarding and interesting jobs. That's how we will be able to maintain, at least for some time I hope, the standard of living that we have become accustomed to.
We have to embrace cobot technologies in order to create this high-skilled, augmented human labor. If we don't, we will lose.
(Note: The interview was edited for length and clarity. It was conducted for educational purposes and the views expressed therein are those of the expert and do not necessarily reflect those of Robotiq.)