Recently we’ve heard some great things about therapists Tes, Karim, Eliza and Ellie. So what, you might ask; there are surely plenty of great therapists out there to choose from. What makes Tes, Karim, Eliza and Ellie unique is that they’re all artificial intelligence therapists. These AI therapists use pioneering software that enables them to recognise and understand emotion in people, and to continually refine and develop that understanding. So how is robotic emotional intelligence learning new tricks?
AI, EQ and the 1960s – just a trippy dream?
It’s easy to think of the marriage between emotional intelligence (EQ) and artificial intelligence (AI) as relatively new; however, early pioneers of computer science such as Joseph Weizenbaum began work on fusing AI and EQ as early as the 1960s. Whilst working at the MIT Artificial Intelligence Lab, Weizenbaum developed ELIZA, an early AI therapist. Weizenbaum described how his nontechnical staff at the lab interacted with the machine as though it were a “real” therapist, happily spending hours divulging their personal problems to the program in the belief that ELIZA understood those problems and was capable of offering constructive solutions – a role the program was a long way from being able to fulfil.
A more sophisticated virtual therapist
Since those early days at the MIT AI Lab, things have moved on considerably. Ellie, the AI therapist developed at the University of Southern California’s (USC) Institute for Creative Technologies, aims to treat people with depression and veterans with post-traumatic stress disorder. Ellie uses a number of algorithms to decide which questions to ask and which motions to make. A webcam and microphone let her monitor the patient, giving her the feedback she needs to know when to ask questions, when to nod and how to read emotional cues. Sixty-six points on the patient’s face are tracked, allowing facial expressions to be registered during the session, including the “flat” expression often displayed in depression. Speech rate and pauses in dialogue are also recorded, whilst a Kinect sensor monitors visual cues such as posture, nods and eye movement. But is Ellie comparable to a real therapist? The Institute says no, and Ellie herself is programmed to explain to clients from the outset that she “is not a therapist”. This is made clear so that clients understand why Ellie does not ask for further information if they disclose something serious.
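To make the idea concrete, here is a minimal Python sketch of how multimodal cues like these might be fused into a single distress signal. Everything in it – the field names, the weights, the thresholds – is an illustrative assumption, not USC’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class SessionFrame:
    """One sampled observation of the patient (field names are illustrative)."""
    expression_variability: float  # 0.0 = completely "flat" face, 1.0 = highly expressive
    speech_rate_wpm: float         # words per minute in the last utterance
    pause_ratio: float             # fraction of the utterance window spent silent
    gaze_down_ratio: float         # fraction of time gaze was directed downward

def distress_score(frame: SessionFrame) -> float:
    """Combine multimodal cues into a rough 0-1 distress estimate.

    A system like Ellie learns these mappings from annotated sessions;
    the fixed weights here are purely illustrative.
    """
    flatness = 1.0 - frame.expression_variability
    slow_speech = max(0.0, (120.0 - frame.speech_rate_wpm) / 120.0)
    score = (0.4 * flatness + 0.3 * frame.pause_ratio
             + 0.2 * slow_speech + 0.1 * frame.gaze_down_ratio)
    return min(1.0, max(0.0, score))

# A withdrawn-looking frame produces a high score, which might
# prompt the virtual therapist to ask a gentler follow-up question.
frame = SessionFrame(expression_variability=0.15, speech_rate_wpm=80.0,
                     pause_ratio=0.5, gaze_down_ratio=0.6)
print(f"estimated distress: {distress_score(frame):.2f}")
```

In practice a system like Ellie would learn such mappings from thousands of annotated sessions rather than rely on hand-picked weights; the sketch simply shows how facial, vocal and postural cues can feed one decision.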
Can robotic EQ compete with your emotional intelligence?
So virtual therapists still have their limitations, but what about the recognition and simulation of human emotions in the world of affective computing more generally?
The ability to manage one’s emotions whilst making informed decisions in a rational way is one of the abilities that characterise high emotional intelligence in individuals. Making decisions unfettered by strong emotion is something which AI does well. A few examples are:
Banks. A number of banks have begun using badges to transmit behavioural data on traders to boost performance. Andrew Lo, professor of finance at MIT, puts it this way:
“Imagine if all your traders were required to wear wristwatches that monitor their physiology, and you had a dashboard that tells you in real time who is freaking out. The technology exists, as does the motivation—one bad trade can cost $100 million—but you’re talking about a significant privacy intrusion.”
The badges are equipped with microphones and proximity sensors, enabling managers to identify employees who are feeling “out of their depth” or to highlight more advantageous behaviour that can be used to aid future team training. Does this mean AI is better than human managers at spotting people who are struggling in the workplace? Context is everything here: an emotionally intelligent manager will be able to discern between someone who is genuinely out of their depth and someone who is emotionally charged whilst on a resilient learning curve to success. Although AI can spot that both produce similar behavioural data, whether it can differentiate between the two is another matter.
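As a rough illustration of how such a dashboard might flag someone “freaking out” in real time, here is a minimal Python sketch that compares each reading with a trader’s own rolling baseline. The badge fields, the use of heart rate and the thresholds are all assumptions made for illustration, not any vendor’s real system.

```python
from collections import defaultdict, deque
from statistics import mean, stdev

class BadgeMonitor:
    """Flags traders whose physiological readings deviate sharply
    from their own rolling baseline (all values illustrative)."""

    def __init__(self, window: int = 50, z_threshold: float = 2.5):
        self.z_threshold = z_threshold
        self.history = defaultdict(lambda: deque(maxlen=window))

    def record(self, trader_id: str, heart_rate: float) -> bool:
        """Store a reading; return True if it looks like a stress spike."""
        readings = self.history[trader_id]
        flagged = False
        if len(readings) >= 10:  # need a baseline before judging anyone
            mu, sigma = mean(readings), stdev(readings)
            if sigma > 0 and (heart_rate - mu) / sigma > self.z_threshold:
                flagged = True
        readings.append(heart_rate)
        return flagged

monitor = BadgeMonitor()
for hr in [72, 74, 71, 73, 75, 70, 72, 74, 73, 71]:
    monitor.record("trader_42", hr)
print(monitor.record("trader_42", 110))  # True: a sudden spike against baseline
```

The per-trader baseline is the point: a raw spike says little on its own, which echoes the context problem above – the same data could mean someone drowning or someone stretching.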
Cab driving. A few years ago I witnessed a London cab driver unleash a torrent of abuse on a cyclist when both were stopped at a set of lights. From the back of the cab the female passenger shouted, “I’m paying for a journey, not for aggression,” at which the cabbie disengaged from his argument. As driverless cars are provided with responsive technology that reads the comfort levels of their passengers and adjusts driving style to match each passenger’s preferences, fraught cab journeys could soon be a thing of the past.
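A hedged sketch of the kind of feedback loop this implies: a toy proportional controller in Python that eases off the driving style as the passenger’s estimated comfort drops. The comfort estimate, scale and gain are invented for illustration and belong to no real manufacturer’s system.

```python
def adjust_driving_style(comfort: float, aggressiveness: float,
                         gain: float = 0.1) -> float:
    """Nudge driving aggressiveness toward the passenger's comfort.

    comfort: 0-1 estimate from in-cabin sensing (an assumption),
    where 0.5 means 'content'. aggressiveness: 0 = very gentle,
    1 = very brisk. All scales are illustrative.
    """
    error = comfort - 0.5            # negative when the passenger is uneasy
    new_style = aggressiveness + gain * error
    return min(1.0, max(0.0, new_style))

style = 0.7  # fairly brisk driving
for comfort in [0.4, 0.3, 0.2, 0.2]:  # passenger grows increasingly uneasy
    style = adjust_driving_style(comfort, style)
print(f"aggressiveness after feedback: {style:.2f}")  # eases off toward 0.6
```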
Medical technology. Deep reinforcement learning has enabled AI technology and its applications to progress rapidly. One area where deep learning is being successfully trialled is medicine. With health services becoming increasingly stretched, creating an ever more pressurised and stressful working environment for health professionals, mistakes can be made. Enlitic, a US medical technology startup, is currently trialling its technology in forty Australian clinics. In tests so far, Enlitic’s software was 50% better at classifying malignant tumours than clinicians, and missed no cancers, compared with a 7% miss rate for clinicians. Another of Enlitic’s systems, which examines X-rays to detect wrist fractures, also outperformed clinical experts.
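For readers curious what such a classifier looks like in code, here is a deliberately small PyTorch sketch that labels greyscale scan patches as benign or malignant. The architecture, input size and names are illustrative assumptions; Enlitic’s production system is not public and would be far larger, and trained on real annotated scans.

```python
import torch
import torch.nn as nn

class LesionClassifier(nn.Module):
    """A tiny CNN for 'benign vs malignant' image patches.

    A generic sketch of the kind of deep-learning classifier described
    above, not Enlitic's actual architecture.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                           # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                           # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)   # benign / malignant

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = LesionClassifier()
scan_patch = torch.randn(1, 1, 64, 64)          # one fake greyscale patch
probs = model(scan_patch).softmax(dim=1)
print(f"P(malignant) = {probs[0, 1].item():.2f}")  # untrained, so roughly 0.5
```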
What the future holds
Although robotic emotional intelligence has increased in sophistication, significant leaps in development need to occur before AI technology can offer a realistic alternative to human emotional intelligence. Jobs that rely on social awareness, social interaction and empathy to achieve results are unlikely to be replaced any time soon.