Right now, being angry with your computer does nothing, but what if your software could consider your mood? Affective computing lets a computer detect and interpret your emotional state (affect) and use it as a form of input.
Artificial (Emotional) Intelligence
In 1995, Rosalind Picard published a paper outlining the basics of affective computing, later expanded into a book. The idea is to equip computers with emotional intelligence (EQ) in addition to the analytical intelligence that makes them so useful.
Affective computing allows a computer system to scan a person's emotional cues, such as facial expressions, tone of voice, body language, and word choice, for information about their mental state.
Once the computer has inferred what its user is feeling, it can react in a way that is (hopefully) useful to the user. There are many ways in which computers could use this information.
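To make the idea concrete, here is a minimal sketch of affect as an input channel. The cue names, emotion labels, and the split between a classifier and a response table are all assumptions for illustration, not a real affective-computing API.

```python
# Toy sketch: treat the user's emotional state as just another input.
# Cue names and thresholds below are invented for illustration.

def classify_affect(cues: dict) -> str:
    """Map raw emotional cues to a coarse label (hypothetical heuristic)."""
    if cues.get("brow_furrowed") and cues.get("voice_volume", 0) > 0.8:
        return "frustrated"
    if cues.get("smiling"):
        return "happy"
    return "neutral"

# How the software might react to each coarse label.
RESPONSES = {
    "frustrated": "Offer help or simplify the current task.",
    "happy": "Stay out of the way.",
    "neutral": "Behave normally.",
}

def respond(cues: dict) -> str:
    """Pick a response based on the inferred emotional state."""
    return RESPONSES[classify_affect(cues)]
```

In a real system the classifier would be a trained model rather than a few if-statements, but the overall shape, sense the user's state and adapt the interface, is the same.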
Remember Clippy, the Microsoft Office Assistant? Imagine if Clippy could tell when you were really frustrated and showed up only when you genuinely needed help, instead of when you were just trying to do your job.
Computers Improve on Faces
People show emotions in a variety of ways, but our faces are the main canvas on which we paint our feelings for the world to see. Even the best poker face can't hide tiny microexpressions, although it is not yet clear how these should be interpreted.
When the original work on affective computing was written, getting a computer to recognize and interpret a human face was a daunting challenge. Now we have effective machine learning hardware in our devices that can recognize and map a face in a fraction of a second.
Of course, you need more than just the ability to recognize and map a face to get affective information from it, but at least we can now capture the raw facial information with relative ease. This same machine learning technology, combined with heaps of facial data, will probably tease out the most important emotional information we need for affective computing to work well.
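As a toy illustration of the gap between mapping a face and reading it: once a face has been reduced to landmark coordinates, even a crude geometric rule can pull out an affective hint. The landmark names and the smile rule here are invented for this sketch; real systems learn such patterns from data.

```python
# Toy heuristic over facial landmarks (x, y) with y growing downward,
# as in image coordinates. Landmark names are invented for this sketch.

def looks_like_smile(landmarks: dict) -> bool:
    """True if both mouth corners sit above the mouth center."""
    left = landmarks["mouth_left"]
    right = landmarks["mouth_right"]
    center = landmarks["mouth_center"]
    return left[1] < center[1] and right[1] < center[1]
```

A real classifier would weigh dozens of such landmark relationships at once, which is exactly where the machine learning and the heaps of facial data come in.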
We Treat Our Computers More Like Humans
Computer interfaces are becoming more like us every day. Living things like humans need millions of years to change, but our computers are changing and improving at lightning speed.
In the early days, simple computers forced us to adapt to them: punched cards, cryptic computer languages, command lines, and eventually today's graphical user interfaces. Touch screens have made computers easier for everyone to pick up and use because they translate our native spatial intelligence into a digital format.
Today’s computers are powerful enough to understand natural speech. You are more likely to deal with a virtual agent when you ask for help or information. We have voice assistants everywhere.
As computer interfaces become increasingly intuitive and natural, adding emotional information to that interaction could transform how well those interfaces work.
RELATED: How To Use A Voice Assistant Without It “Always Listening”
Emotions Are Difficult For People Too
Despite the fact that we have evolved to understand and express emotions, people misread them all the time. While some seem to have an almost supernatural level of emotional intelligence, for most of us reading emotions remains a complex task.
So while affective computing sounds great on paper, in practice it's not that simple, even with all the amazing new technology we have. It is reasonable to expect that the first mainstream systems to use this approach will focus on a small set of crude emotional expressions.
If your computer knows you're exhausted, it might suggest taking a break. If it knows that certain images in your photo slideshow make you happier than others, it could put those into heavier rotation or add more images like them.
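The slideshow example above can be sketched in a few lines. Assume, hypothetically, that an affect detector has already scored the user's reaction to each image between 0 and 1; building the rotation is then just a ranking problem.

```python
# Sketch: weight a slideshow toward the images that got the happiest
# reactions. The reaction scores (0..1) are assumed to come from some
# affect detector; image names here are purely illustrative.

def build_rotation(reactions: dict, top_n: int = 2) -> list:
    """Return the top_n image names ranked by happiness score, best first."""
    ranked = sorted(reactions, key=reactions.get, reverse=True)
    return ranked[:top_n]
```

The hard part, of course, is producing trustworthy scores in the first place; once you have them, adapting the interface is straightforward.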
Clearly, there are many ways in which affective computing can benefit us, but don’t expect it to be perfect from day one!
The Dark Side of Affective Computing
Affective computing represents a major leap in how people interact with machines, but it also opens users up to new forms of exploitation.
Marketing psychology is already capable of manipulating our emotions to change our buying behavior. That's why car advertising focuses on how a car will make you feel rather than how much horsepower it has or how fuel efficient it is.
Much of our decision-making is driven by emotion, so imagine if social media companies could read your emotional reaction to posts or advertisements. One day you may have to tap an "emotion scan" permission button along with those for your camera and microphone.
RELATED: 5 Psychological Tricks in Free Games (and How to Avoid Them)