Team players

When the world’s first hotel managed entirely by robots opened in Japan two years ago, the owners of the group behind the project explained that it was all about communication: the robots must not only understand the human guests, but also be aware of their robot “colleagues”. Such developments are possible thanks to recent improvements in the analytical capabilities of machines, not least the growing use of artificial intelligence.

A prime example is the cobot YuMi, launched in 2014 by the Swedish-Swiss industrial group ABB. This two-armed industrial robot uses a camera system to monitor the movements of its human co-workers; as soon as any of them gets too close, YuMi slows down or pauses whatever it is doing in order to protect them from injury. Haptic interaction is also possible: when a co-worker grasps its arm and guides it through a series of movements, the cobot memorises the procedure and can then imitate its human colleagues. “These interactions between humans and robots are becoming increasingly reliable”, says Dirk Wollherr of the Department of Automatic Control Engineering at the Technical University of Munich. This requires the machines to be fitted with multiple sensors. “Depending on what’s required, the robot can use its visual, haptic or auditory sensors to communicate with its environment.”
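
Stripped to its essentials, YuMi’s safety behaviour is a simple rule: estimate how far away the nearest person is and scale the robot’s speed accordingly. The short sketch below illustrates the idea in Python; the distance thresholds and numbers are assumptions chosen for illustration, not ABB’s actual controller logic.

```python
# Minimal sketch of a proximity-based speed-scaling rule for a cobot.
# Thresholds and values are illustrative assumptions, not ABB's YuMi code.

SLOW_DOWN_DISTANCE_M = 1.0   # begin slowing when a person is closer than this
STOP_DISTANCE_M = 0.3        # pause completely inside this radius
FULL_SPEED = 1.0             # fraction of the programmed working speed

def speed_factor(nearest_person_distance_m: float) -> float:
    """Scale the robot's speed linearly between the stop and slow-down radii."""
    if nearest_person_distance_m <= STOP_DISTANCE_M:
        return 0.0
    if nearest_person_distance_m >= SLOW_DOWN_DISTANCE_M:
        return FULL_SPEED
    span = SLOW_DOWN_DISTANCE_M - STOP_DISTANCE_M
    return FULL_SPEED * (nearest_person_distance_m - STOP_DISTANCE_M) / span

# A person 0.6 m away: the robot keeps working, but at reduced speed.
print(speed_factor(0.6))   # -> roughly 0.43
```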

Google for robots

This environment will soon include robots which can exchange information amongst themselves. A spectacular step in this direction occurred in late 2014 with the launch of the RoboBrain platform developed by Stanford University. RoboBrain is a kind of search engine: a robot that encounters a problem can use this network to get information on how another robot reacted in a similar situation. Two years ago a robot was able, for the first time, to learn from another robot hundreds of kilometres away how to place mugs on bowls that had been turned upside down.
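
One way to picture such a “search engine for robots” is as a shared library of demonstrations that any robot can add to or query. The sketch below is a deliberately simplified, hypothetical illustration of that idea; the data model and function names are invented and have nothing to do with RoboBrain’s real interface.

```python
# Hypothetical sketch of a shared skill library that robots publish to and query.
# Invented data model and names; not RoboBrain's actual interface.

from dataclasses import dataclass, field

@dataclass
class Demonstration:
    task: str            # e.g. "place mug on upside-down bowl"
    robot_model: str     # the robot that recorded it
    waypoints: list      # the recorded trajectory, in that robot's own terms

@dataclass
class SkillLibrary:
    demos: list = field(default_factory=list)

    def publish(self, demo: Demonstration) -> None:
        """A robot uploads what it has just learned."""
        self.demos.append(demo)

    def query(self, task: str) -> list:
        """Another robot, anywhere on the network, asks how a similar task was solved."""
        return [d for d in self.demos if task.lower() in d.task.lower()]

library = SkillLibrary()
library.publish(Demonstration("place mug on upside-down bowl", "robot A", [(0.2, 0.1), (0.4, 0.3)]))
print([d.robot_model for d in library.query("mug")])   # -> ['robot A']
```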

Wollherr foresees a significant challenge before such a system can enjoy widespread use. “Each robot model is different”, he says. “Unlike computer code, whose language is always the same regardless of the operating system, there is no common programming language for robots. A robot must therefore be able to replicate the action of another robot in its own model-specific setup.” Much more promising are teams made up of a single type of robot; to this end, two leading robot manufacturers, Japan’s Fanuc and ABB, are working together on a digital, network-connected system for specific manufacturing processes. This could have huge benefits for industry, as it can often take several days to reprogram an industrial robot when a production process changes.
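
In software terms, the problem Wollherr describes is often handled with an adapter layer: actions are described in a model-independent way, and each robot translates them into its own commands. The sketch below uses two invented robot models purely to illustrate that pattern; it is not the Fanuc/ABB system.

```python
# Adapter-style sketch: one abstract action, several model-specific translations.
# The robot classes and their command sets are invented for illustration.

from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class PickAndPlace:
    """Model-independent description of a simple action."""
    pick_xyz: tuple
    place_xyz: tuple

class RobotAdapter(ABC):
    """Each robot model translates the shared action into its own commands."""
    @abstractmethod
    def execute(self, action: PickAndPlace) -> None: ...

class ModelA(RobotAdapter):
    def execute(self, action: PickAndPlace) -> None:
        # Hypothetical command set of robot model A
        print(f"A: MOVE {action.pick_xyz}; GRIP; MOVE {action.place_xyz}; RELEASE")

class ModelB(RobotAdapter):
    def execute(self, action: PickAndPlace) -> None:
        # The same action expressed in model B's own (invented) commands
        print(f"B: goto {action.pick_xyz} grip goto {action.place_xyz} release")

action = PickAndPlace(pick_xyz=(0.4, 0.1, 0.2), place_xyz=(0.1, 0.5, 0.2))
for robot in (ModelA(), ModelB()):
    robot.execute(action)
```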

Communication with robots also has great potential off the factory floor. Dario Floreano, professor of Intelligent Systems at the École Polytechnique Fédérale de Lausanne (EPFL), and his colleagues are working on ways of simplifying the guidance systems for drones. “It is very difficult to operate a drone by remote control; drones for professional use require several days of training before they can be properly steered”, he says. With this in mind, the EPFL team is developing a soft exoskeleton for flying drones. The person wearing the jacket controls the drone’s flight by moving their upper body while viewing the world through the drone’s camera. “It only takes a few minutes to learn the procedure”, explains Floreano. At the same time, the drone acquires a greater degree of autonomy: it could, for example, recognise and correct pilot error, or assume full control of the flight path as soon as it detects an increase in the pilot’s stress levels. Such human-drone interaction could also help rescue the victims of natural disasters more quickly, or locate flight recorders after a crash.
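
The jacket’s control principle can be pictured as a simple mapping from the pilot’s posture to velocity commands, with the drone free to take over when the pilot is overloaded. The gains, thresholds and interface in the sketch below are assumptions made for illustration, not the EPFL team’s actual software.

```python
# Sketch of mapping upper-body posture to drone velocity commands, with an
# autonomy override when the pilot is stressed.  Gains, thresholds and the
# interface below are illustrative assumptions, not the EPFL jacket's software.

PITCH_GAIN = 2.0          # forward speed (m/s) per radian of leaning forward
ROLL_GAIN = 1.5           # turn rate (rad/s) per radian of leaning sideways
STRESS_THRESHOLD = 0.8    # above this level (0..1), the drone takes over

def command_from_posture(torso_pitch_rad, torso_roll_rad, stress_level,
                         autopilot_command=(0.0, 0.0)):
    """Return (forward_speed, turn_rate) for the drone."""
    if stress_level > STRESS_THRESHOLD:
        # Pilot is overloaded: hand control back to the autonomous flight path.
        return autopilot_command
    return (PITCH_GAIN * torso_pitch_rad, ROLL_GAIN * torso_roll_rad)

# A relaxed pilot leaning slightly forward and to one side:
forward, turn = command_from_posture(0.3, -0.1, stress_level=0.2)
print(f"forward {forward:.2f} m/s, turn {turn:.2f} rad/s")
```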

Researchers are also working on systems in which many drones are interlinked. One potential application is crop monitoring, where a swarm of drones analyses the condition of a field and decides which parts need to be watered or fertilised. The inspiration comes from the animal world, as Dario Floreano explains: “A bird gets its bearings from the movements of other birds flying around it. It forms a group with them which in turn overlaps with other groups, thus forming a network. This makes the work of the developers easier: the drones only need to maintain contact with their immediate neighbours in the group but, just like a bird, they also develop an idea of the overall size of the swarm.” The same principle works underwater, as demonstrated by the CoCoRo project at the University of Graz in Austria, in which more than 40 small robots communicate using LED signals and move around like a school of fish. The first applications include research in marine biology.
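
The bird-inspired rule Floreano describes can be illustrated with a classic trick from distributed computing: even if each drone only ever averages information with its immediate neighbours, every drone eventually arrives at an estimate of how big the whole swarm is. The ring layout and numbers below are illustrative assumptions, not software from EPFL or the CoCoRo project.

```python
# Sketch: drones that only exchange data with their immediate neighbours can
# still work out roughly how big the whole swarm is.  One drone injects a
# "token" of 1.0; repeated local averaging spreads it until every drone holds
# about 1/N, from which each can estimate N on its own.  The ring layout and
# numbers are illustrative assumptions, not EPFL's or CoCoRo's software.

N = 40                               # true swarm size, unknown to the drones
values = [1.0] + [0.0] * (N - 1)     # only drone 0 starts with the token

for _ in range(2000):                # many rounds of neighbour-only averaging
    values = [(values[i - 1] + values[i] + values[(i + 1) % N]) / 3
              for i in range(N)]

# Each drone's local value now approximates 1/N.
print("drone 0 estimates a swarm of about", round(1 / values[0]))   # -> 40
```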

What is not a robot?

Some things a robot should not be confused with:

Artificial intelligence

Not all robots are equipped with algorithms that give them the capacity to learn. AI, for its part, comes in the form of software, like Google’s AlphaGo, which beat a master Go player last year.

Bots

Chatbots, Twitter bots and shopbots are not robots, for the simple reason that they are not embodied: they consist only of algorithms.

Computer-assisted surgery

Medical tools like the da Vinci Surgical System, which increases the precision of surgery through its tiny wristed instruments, need human direction to fulfil their intended function. Unlike robots, these systems cannot perform tasks by themselves: surgeons are in control at all times.

