How would
your computer respond if you looked frustrated or upset? Could your phone
comfort you if you were sad after getting a call? Could your smart home
adjust the music, lighting, or other aspects of the environment around you
after you’ve had a bad day at work — without being asked?
It may seem
far-fetched, but computers that can read your emotions and have some level of
“emotional intelligence” are not far off. The field is called affective
computing, and it’s being developed for use in many applications.
Affective
computing is not a new field, but it is becoming more relevant today,
especially when combined with big data, robotics, and machine learning.
Why do we
want a computer to empathize with us?
Emotions
are a fundamental part of the human experience, but they have long been ignored
in technology development because they seemed difficult to quantify and because
the technology to read them didn't really exist. The result has sometimes been
frustrating user experiences.
Programs will
soon use facial expressions and micro-expressions, posture, gestures, tone of
voice, speech, and even the rhythm or force of keystrokes and the temperature
of your hands to register changes in your emotional state.
Cameras and other sensors will send the
input data to deep learning algorithms that determine what your emotional state
might be — and then react accordingly.
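To make that pipeline concrete, here is a minimal Python sketch of how sensor
readings might be mapped to an emotional state and a response. Everything in it
is an illustrative assumption: the sensor features, the hand-tuned scoring rule
(standing in for a trained deep learning model), and the responses are
hypothetical, not any vendor's actual system.

```python
# Illustrative sketch only: the sensor features, weights, and thresholds below are
# hypothetical stand-ins, not any real product's API. A real affective-computing
# system would feed this kind of data to trained deep learning models instead.
from dataclasses import dataclass


@dataclass
class SensorReading:
    brow_furrow: float      # 0..1, from camera-based facial expression analysis
    voice_pitch_var: float  # 0..1, variability in tone of voice
    keystroke_force: float  # 0..1, normalized typing pressure and rhythm
    hand_temp_c: float      # skin temperature in degrees Celsius


def estimate_emotion(reading: SensorReading) -> str:
    """Map raw sensor features to a coarse emotional state (stand-in for a trained model)."""
    stress_score = (
        0.5 * reading.brow_furrow
        + 0.3 * reading.voice_pitch_var
        + 0.2 * reading.keystroke_force
    )
    if stress_score > 0.6:
        return "frustrated"
    if stress_score > 0.4 and reading.hand_temp_c < 30.0:
        return "anxious"
    return "calm"


def react(emotion: str) -> str:
    """Choose an environmental response, e.g. for a smart-home or in-car controller."""
    responses = {
        "frustrated": "dim the lights, queue a calming playlist, suggest a break",
        "anxious": "lower notification volume, offer a breathing exercise",
        "calm": "no change",
    }
    return responses[emotion]


if __name__ == "__main__":
    reading = SensorReading(brow_furrow=0.8, voice_pitch_var=0.5,
                            keystroke_force=0.7, hand_temp_c=29.0)
    emotion = estimate_emotion(reading)
    print(f"Detected state: {emotion} -> {react(emotion)}")
```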
And the
applications of these tools are practically limitless. E-learning programs
could automatically detect when a learner is having difficulty and offer
additional explanations or information. E-therapy could deliver
psychological health services online, potentially as effectively as in-person
counseling.
Companies
including the BBC, CBS, Coca-Cola, and Disney are already partnering with Affectiva,
a leading company in facial expression recognition, to test the
effectiveness of advertisements and to gauge how viewers react to film trailers
and TV shows.
The company
is now working with “a very large Japanese car company” to create in-car
technology that can sense when you’re drowsy or distracted and can contact
emergency services, a friend, or a family member in an emergency.
Microsoft
has even tested a bra that can sense stress levels. Other
applications are being created to help people on the autism spectrum interact
with others. People with autism often have difficulty recognizing the
emotions of others, and small wearable devices can alert them to another
person’s emotional state, helping them react and interact in social situations.
Another
medical device can alert the wearer to changes in their biometric data (heart
rate, temperature, etc.) in the moments before, during, and after a dangerous
epileptic seizure.
Just as
“artificial intelligence” isn’t the same as human intelligence (computers
“think” in fundamentally different ways than the human brain), “emotional”
machines won’t really be emotional.
Combine
affective computing with machine learning, big data, and robotics,
and we are on the edge of a time when machines will at least seem to
respond to us with sympathy and other emotions.
Your
refrigerator might suggest you skip the ice cream tonight because your stress
levels are high. Your car might warn you to drive carefully because you seem
tense. Your phone might encourage you to take breaks because you’re
getting frustrated.
Robots already
exist that can recognize the faces of different family members and respond
accordingly. Soon they will be able to recognize your emotions as well and
offer helpful suggestions. The age
of the “emotional” computer is coming.
Article by Bernard Marr (edited)