Industrial robots have been a familiar sight on factory floors for several decades, but now they are being
joined by a new breed known as cobots. Short for “collaborative robots”, cobots are designed to work alongside people, performing tasks that assist them – and vice versa. At BMW’s Dingolfing plant, for example, a ceiling-mounted cobot from German manufacturer Kuka takes on the repetitive strain of mounting gearboxes while its human counterparts safely add finishing touches with finesse and flexibility.
The first cobots were patented in 1999, designed as intelligent hoisting assistants for General Motors’ automotive plants, to help minimise injuries from ergonomically difficult tasks. Although small and nimble cobots accounted for less than 5% of global robot sales in 2015, Barclays Equity Research estimates that this $120 million market could jump to $3.1 billion by 2020 and $12 billion by 2025. That would mean 150,000 cobots sold in 2020 and 700,000 in 2025.
One person tracking this trend is Sebastian Pfotenhauer, an innovation research professor at the Technical University of Munich (TUM). “Cobots have the potential not only to fundamentally change the lives of workers but also to change public spaces and many public service sectors”, he says. One question lies at the heart of this transformation: if robots are increasingly being trained to work with and alongside humans, should we be trained to work with robots?
Pfotenhauer says it’s vital to understand the use of cobots as “a social interaction” that shapes our roles and identities, instead of purely an engineering endeavour. “It’s not just that the robot adapts to what humans do”, he argues. “It’s a mutually co-shaping relationship.” Pfotenhauer believes this could be as simple as understanding the requirements of looking after cobots in their intended setting, or as complex as adjusting to fundamentally altered social dynamics at workplaces or in daily life.
Maarten Steinbuch, chair of Control Systems Technology at the Eindhoven University of Technology, also predicts a leap in workplace robot interaction, although he doubts that new advances in industry will lead to many people working
elbow-to-elbow with cobots. “There’s the image of a physical robot sitting next to me performing tasks which are programmable and require accuracy, while I’m doing another task which may require more flexibility – but this might only happen 5% of the time in a manufacturing context. I think most of the time we will have optimised factory lines where robotic machines do their work and humans are used for many other tasks around a factory, including training robots.”
ABB’s YuMi, a collaborative robot that slows down or stops moving when a human worker gets too close.
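How might a cobot like YuMi decide when to slow down? One common approach is speed-and-separation monitoring: the closer a person gets, the slower the arm moves, down to a full stop. The Python sketch below illustrates the principle only; the distance thresholds, the linear ramp and the function interface are assumptions made for illustration, not ABB’s published safety specification.

# Illustrative sketch of speed-and-separation monitoring: scale a cobot's
# speed by the distance to the nearest detected person, stopping entirely
# below a safety threshold. Thresholds here are assumed values, not ABB's.

STOP_DISTANCE_M = 0.5      # assumed: halt completely inside this radius
SLOW_DISTANCE_M = 1.5      # assumed: begin slowing down inside this radius

def speed_fraction(nearest_person_distance_m: float) -> float:
    """Return a speed multiplier in [0, 1] based on human proximity."""
    if nearest_person_distance_m <= STOP_DISTANCE_M:
        return 0.0                                   # too close: stop
    if nearest_person_distance_m >= SLOW_DISTANCE_M:
        return 1.0                                   # clear: full speed
    # Linear ramp between the stop radius and the slow-down radius.
    return ((nearest_person_distance_m - STOP_DISTANCE_M)
            / (SLOW_DISTANCE_M - STOP_DISTANCE_M))

# Example: a worker detected 0.9 m away would scale the arm to 40% speed.
print(speed_fraction(0.9))   # 0.4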
Cooperation in hospitals
Nevertheless, Steinbuch believes that we need to continually re-evaluate our interactions with technology. “Dutch start-up Smart Robotics has taken the initiative to re-educate people who are working on the shop floor so that they can work together with robots”, he explains. The plan is to educate about 30,000 people over the next few years, teaching them how to program and interact with robots rather than being displaced by them. Some of the most popular cobots currently employed in factories are Swiss firm ABB Robotics’ YuMi system, designed for consumer electronics assembly lines; Rethink Robotics’ touchpad-topped Sawyer and Baxter humanoids, used for anything from logistics to inspection; and the UR robotic arms of Denmark’s Universal Robots, which the company claims can automate virtually any manufacturing task.
But it’s outside factories that Steinbuch foresees the biggest future of work alongside robots, particularly in social settings such as hospitals, care work and frontline customer service. Medicine is a field in which surgeons already use sophisticated machinery to perform complex operations they would otherwise be incapable of. Although smart, these so-called master-slave surgical technologies are often considered tools rather than robots. A case in point is the Da Vinci system, which enables precise keyhole surgery by translating a surgeon’s hand gestures into smaller, steadier and more accurate movements. But this distinction may soon no longer ring true. “My research group is also working on a robot for cochlear implants near the ear”, says Steinbuch. “You can envision that by using CT scan images, you no longer need the master-slave. You can also detach the robot from the surgeon and give it the autonomous task of performing the surgery itself while [the surgeon] supervises, making it a true robot.”
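For readers curious about what translating hand gestures into smaller, steadier movements means in practice, the sketch below shows the general principle of motion scaling combined with tremor smoothing. The class, scale factor and smoothing constant are illustrative assumptions, not the Da Vinci system’s actual control code.

# Minimal sketch of the idea behind master-slave teleoperation: the
# surgeon's hand motion is smoothed and scaled down before it reaches the
# instrument. All constants below are assumed for illustration only.

MOTION_SCALE = 0.2    # assumed: 5:1 scaling of hand motion to tool motion
SMOOTHING = 0.3       # assumed: exponential smoothing factor to damp tremor

class ScaledInstrument:
    def __init__(self):
        self.tool_position = [0.0, 0.0, 0.0]
        self._filtered_delta = [0.0, 0.0, 0.0]

    def apply_hand_motion(self, hand_delta_mm):
        """Turn a hand movement (mm) into a smaller, smoothed tool movement."""
        for axis in range(3):
            # A low-pass filter damps high-frequency tremor...
            self._filtered_delta[axis] = (
                SMOOTHING * hand_delta_mm[axis]
                + (1 - SMOOTHING) * self._filtered_delta[axis]
            )
            # ...and the scale factor shrinks the motion for keyhole precision.
            self.tool_position[axis] += MOTION_SCALE * self._filtered_delta[axis]
        return self.tool_position

instrument = ScaledInstrument()
print(instrument.apply_hand_motion([10.0, 0.0, 0.0]))  # 10 mm hand move -> 0.6 mm tool move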
In terms of the concrete skills that future workplaces may require, Steinbuch sees ever-increasing demand for engineers, as well as “basic robot maintenance skills”. Pfotenhauer adds coding to the technical skill sets of future co-workers, as well as the capacity for lifelong learning and “a greater attention to social and political education to enable workers to navigate and shape the robot transformation”.
Make children familiar with robots
With more collaborative jobs set to be created in the coming years, Pfotenhauer says we will have to assume responsibility for governing cobot growth in a socially responsible way. He sees two important moves: for universities to emphasise social consequences within technical subjects like engineering, and for countries whose economies rely on highly automated industries to prioritise highly skilled or creative education. “If some jobs are going to be either reconfigured or replaced, then a lot of the value that humans bring to the economy will be either in creativity or social tasks requiring human interaction.”
Both professors agree that discussion of technology’s risks and promises has to begin at even earlier ages, to engage the youngest minds with the issues they’ll inherit. One potential candidate for this job is Roboy, a child-sized humanoid robot with synthetic muscles and tendons instead of motorised joints, making it safe for humans to interact with directly. Born in a Zurich artificial-intelligence lab, the pint-sized pioneer is evolving through a cross-disciplinary collaboration at TUM and has travelled with a robotics expert to universities, schools and events around the world to challenge people’s fears and preconceptions.
Meanwhile, researchers from MIT and Boston University are testing a system that allows people to correct robot mistakes using only brain signals. The system uses sensors to monitor a person’s brain activity as they watch a Baxter cobot perform sorting tasks, and looks for the error-related signals that appear when the observer notices a mistake, so the robot can correct itself on the spot. The researchers hope this could one day allow us to direct our cobots wordlessly and instantaneously.
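In outline, the correction loop described above can be pictured like this: the robot makes a binary sorting choice, a classifier watches the observer’s brain activity for an error-related signal, and a detected signal flips the choice. The Python sketch below is a schematic illustration under those assumptions; the classifier and the robot interface are placeholders, not the researchers’ actual system.

# Schematic sketch of brain-signal error correction on a binary sorting
# task: if the observer's EEG flags an error, the robot switches to the
# other bin. The "classifier" below is a random stand-in, for illustration.

import random

def detect_error_signal(eeg_window) -> bool:
    """Placeholder for a trained error-potential classifier."""
    return random.random() < 0.2   # stand-in: pretend 20% of windows flag an error

def sort_object(robot_choice: str, eeg_window) -> str:
    """Let the robot act, then correct it if the observer's brain 'objects'."""
    if detect_error_signal(eeg_window):
        # Binary task: an error signal simply means "the other bin".
        robot_choice = "bin_B" if robot_choice == "bin_A" else "bin_A"
    return robot_choice

# Example run over a stream of (choice, EEG window) pairs.
for choice in ["bin_A", "bin_B", "bin_A"]:
    print(sort_object(choice, eeg_window=None))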