Machines as caregivers

In one room, a simulator analyses the reactions of electrode-wired motorists to better understand their reflexes in different traffic situations. In another, a wheelchair and a leg exoskeleton are controlled by the user’s thoughts, much like the screen-equipped robots that let paralysed people move about virtually. This is all part of the research conducted by the Chair in Non-Invasive Brain-Machine Interface (CNBI) at the École Polytechnique Fédérale de Lausanne. It draws on artificial intelligence, whose health care applications extend well beyond the Watson supercomputer and its impressive accuracy in diagnosing disease.

AI and machine learning are the future of health care, with advances being made every day. Watson, IBM’s flagship diagnostic system, examined 1,000 cancer cases and recommended the same treatment as doctors 99% of the time. Even more striking, the software revealed that the oncologists had overlooked care options in 30% of cases. Other companies are looking for ways to apply AI to health care. In 2014, for example, Google bought the British artificial intelligence company DeepMind with plans to apply its technology to diagnostics. Dell, HP and Apple have launched similar projects. Some analysts predict the market will grow tenfold over the next five years.

AI also opens up fascinating opportunities in machine and software automation. That is the focus of research being carried out by the CNBI at the new Biotech Campus in Geneva. The lab uses human brain signals to help people with disabilities, notably paralysis, control instruments and interact with their environment. Using headsets embedded with electrodes, researchers measure brain activity and align it with a patient’s intentions. These signals are then associated with commands that move a wheelchair or shift a cursor around a virtual keyboard. The user thinks about turning right, and the chair obeys. But it takes a lot of data and practice to make it work.
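
To make the pipeline concrete, here is a minimal sketch of the decoding step, assuming band-power features have already been extracted from the headset’s electrodes. The data, the feature count and the use of a linear discriminant classifier are illustrative assumptions, not the CNBI’s actual implementation.

```python
# Illustrative sketch only: mapping EEG features to wheelchair commands.
# Assumes pre-extracted band-power features per trial; all names are hypothetical.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical training data: one row of features per recorded trial,
# labelled with the intention the patient was asked to imagine.
X_train = np.random.rand(100, 16)                         # 100 trials x 16 features (placeholder)
y_train = np.random.choice(["left", "right"], size=100)   # imagined-movement labels

# Linear discriminant analysis is a common choice in brain-machine
# interfaces because it copes well with small, noisy datasets.
clf = LinearDiscriminantAnalysis()
clf.fit(X_train, y_train)

def decode_intention(features):
    """Translate one window of EEG features into a wheelchair command."""
    return clf.predict(features.reshape(1, -1))[0]

print(decode_intention(np.random.rand(16)))  # e.g. "right" -> steer right
```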

Repeating 50 times

A brain impulse is never exactly the same twice. It varies with fatigue, stress and concentration. To build the most widely applicable models possible, scientists first have patients repeat each intention 50 to 100 times to collect as much information as they can. Even that is not very much, says Ricardo Chavarriaga, a researcher at the CNBI. “Compared with the number of movements it takes to learn how to walk, that’s just a tiny sample.”
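
A calibration session of this kind might be sketched as follows; the record_trial function, the feature format and the list of cues are hypothetical stand-ins for the lab’s real recording setup.

```python
# Illustrative calibration loop: each intention is cued and repeated many
# times so the decoder sees the natural variability of the signal.
import numpy as np

def record_trial(cue):
    """Placeholder for one EEG recording while the patient imagines `cue`."""
    return np.random.rand(16)  # stand-in for real extracted features

REPETITIONS = 50                        # 50 to 100 repetitions per intention
cues = ["left", "right", "forward"]    # hypothetical set of intentions

X, y = [], []
for cue in cues:
    for _ in range(REPETITIONS):
        X.append(record_trial(cue))    # fatigue, stress etc. vary across trials
        y.append(cue)

X, y = np.asarray(X), np.asarray(y)
print(X.shape)  # (150, 16): still a tiny sample by motor-learning standards
```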

Once the data have been collected and the models built, testing can begin. A patient practises controlling actions and observes the results in real time. If the patient wants to turn right and the chair keeps going left, the researchers recalibrate the system manually or collect more data. AI comes in when the machine can also learn from the user on its own. When a task goes wrong, the user’s brain emits a signal conveying that sense of failure. The machine picks up that signal, recognises that it has made a mistake and avoids repeating it. Similarly, a user’s signals of joy or satisfaction at the successful completion of a task encourage the machine to replicate that action the next time.
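
The feedback loop could look something like the sketch below, in which a placeholder detect_error function stands in for a real classifier of the brain’s error signal; the weighting scheme is an illustrative assumption, not the lab’s method.

```python
# Sketch of closed-loop correction driven by an error-related brain signal.
import random

def detect_error(eeg_window):
    """Placeholder: True if the brain's 'that was wrong' signal is detected."""
    return random.random() < 0.2  # stand-in for a trained error classifier

# Confidence in each decoded-intention -> command mapping (hypothetical).
weights = {("think_right", "turn_right"): 1.0,
           ("think_right", "turn_left"): 1.0}

def update_after_action(intention, command, eeg_window):
    if detect_error(eeg_window):
        weights[(intention, command)] *= 0.5   # error signal: discourage this mapping
    else:
        weights[(intention, command)] *= 1.1   # success: reinforce the mapping

update_after_action("think_right", "turn_left", eeg_window=None)
```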

“What’s interesting with this approach,” says Chavarriaga, “is that we don’t tell the machine, ‘You must do this or that’. Instead we tell it, ‘This task was not performed correctly. You have to figure out what to do in this situation.’ That’s really important because you can’t always tell the machine exactly what to do.” This is machine learning in its purest form, close to what specialists call reinforcement learning. “These methods are used to help the machines figure things out by exploring,” he says. “The algorithm will gradually optimise the machine’s operation based on its findings.”
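
Read as a reinforcement-learning problem, the idea Chavarriaga describes can be sketched with a simple epsilon-greedy rule: the machine occasionally tries something new, and the brain’s satisfaction or error signals act as the reward. Everything below is a toy illustration, not the CNBI’s algorithm.

```python
# Minimal exploration sketch (epsilon-greedy): the machine is told only
# "that was wrong", never what to do, and must discover the right action.
import random

actions = ["turn_left", "turn_right", "go_forward"]
value = {a: 0.0 for a in actions}   # running estimate of each action's success
counts = {a: 0 for a in actions}
EPSILON = 0.1                       # fraction of the time spent exploring

def choose_action():
    if random.random() < EPSILON:
        return random.choice(actions)      # explore a different behaviour
    return max(actions, key=value.get)     # exploit what has worked so far

def learn(action, reward):
    """reward: +1 if the brain signalled satisfaction, -1 on an error signal."""
    counts[action] += 1
    value[action] += (reward - value[action]) / counts[action]  # incremental mean
```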

Smart furniture

Thomas Bock also uses a form of machine learning. He heads the Chair of Building Realization and Robotics at the Technical University of Munich (TUM), where he rigs furniture with sensors that track a person’s health and alert a doctor if necessary.

Body temperature, for example, is measured in the bathroom using thermal imaging. “If the algorithm picks up an abnormal temperature compared with long-term records, it will investigate further,” Bock explains. It can combine those data with other information, such as blood-oxygen levels monitored by sensors in the kitchen chair. Cameras observe the user and gauge their level of fatigue. The system combines big data and algorithmic analysis to adapt gradually to its user.
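
A simplified version of such a check might look like the following sketch; the thresholds, sensor values and alert logic are assumptions made for illustration, not Bock’s actual system.

```python
# Hedged sketch: flag a reading that deviates from the resident's long-term
# baseline, then cross-check it against another sensor before raising an alert.
import statistics

temperature_history = [36.5, 36.6, 36.4, 36.7, 36.5]  # long-term records, Celsius

def is_abnormal(reading, history, k=3.0):
    mean = statistics.mean(history)
    std = statistics.stdev(history)
    return abs(reading - mean) > k * std   # outside the resident's usual range

def assess(temperature, spo2):
    # Combine independent signals before involving a doctor.
    if is_abnormal(temperature, temperature_history) and spo2 < 92:
        return "alert doctor"
    if is_abnormal(temperature, temperature_history):
        return "investigate further"       # e.g. recheck with other sensors
    return "ok"

print(assess(38.4, 90))  # -> "alert doctor"
```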

Bock is convinced that technology is the key to meeting the challenges of an ageing population. “With this system, we help the elderly remain independent and keep them living at home longer.” Testing is under way at a home in Italy, in partnership with private organisations and other laboratories at TUM.

Trust a trader’s gut

A study from the University of Cambridge found that traders with keen “interoception” – the ability to sense the state of their own body – outperform algorithms at forecasting financial markets. According to the study’s author, this “gut feeling” will be hard to replicate in machines.

