Affective Computing Could Change the Future of Computer Interaction


Right now, being mad at your computer is pointless, but what if your software could track your mood? Affective computing allows a computer to detect and interpret your emotional state (affect) and use it as a form of input.

Artificial emotional intelligence

In 1995, Rosalind Picard published a paper describing the fundamentals of affective computing, followed by a book of the same name in 1997. The idea is to imbue computers with emotional intelligence (EQ) in addition to the analytical intelligence that makes them so useful.

Affective computing allows a computer system to scan a human’s emotional indicators such as facial expression, vocal tone, body language, and words to gain insight into their mental state.

Once the computer has inferred what its user is feeling, it can react in a way that is (hopefully) beneficial to the user. Computers could use this information in many ways.

Do you remember Clippy, the Microsoft Office assistant? Imagine if Clippy could tell when you were frustrated and only showed up when you actually needed help, instead of when you were just trying to get your job done.

Affective computing could even be put to good use in games, virtual reality applications, or when interacting with natural computer interfaces such as Siri.

Computers are getting good with faces

Humans display emotions in a variety of ways, but our faces are the main canvas on which we paint our feelings for the world to see. Even the best poker face can’t hide tiny micro-expressions, though it’s still unclear how these should be interpreted.

When the original affective computing paper was written, the challenge of getting a computer to recognize and interpret a human face was truly daunting. We now have efficient machine learning hardware in our gadgets that can recognize and map a face in fractions of a second.

Of course, you need more than just the ability to recognize and map a face to get affective information from it, but at least now we can get the raw facial information with relative ease. This same machine-learning technology, combined with piles of facial data, will likely reveal the most important emotional insights we need for affective computing to work well.
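To make that second step concrete, here is a minimal, purely illustrative Python sketch. The feature names (`mouth_curve`, `brow_raise`) and the thresholds are invented for this example; a real system would learn such mappings from the piles of facial data mentioned above rather than using hand-tuned rules.

```python
# Toy sketch: map raw facial measurements to a coarse emotion label.
# All features and thresholds are hypothetical, for illustration only.

def classify_affect(mouth_curve: float, brow_raise: float) -> str:
    """Classify a face into a coarse emotion from two made-up features.

    mouth_curve: positive = mouth corners turned up, negative = turned down.
    brow_raise:  positive = brows raised, negative = furrowed.
    """
    if mouth_curve > 0.3:
        return "happy"
    if mouth_curve < -0.3 and brow_raise < -0.2:
        return "frustrated"
    if brow_raise > 0.4:
        return "surprised"
    return "neutral"

print(classify_affect(0.5, 0.0))    # a smiling face -> "happy"
print(classify_affect(-0.4, -0.3))  # frown + furrowed brows -> "frustrated"
```

Even this crude rule-based version shows why the mapping step matters: the same raw landmarks can support very different interpretations depending on how the thresholds are chosen.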

We treat our computers more like people


Computer interfaces look more like us every day. Living things like humans take millions of years to change, but our computers evolve and improve at lightning speed.

In the beginning, simple computers needed us to adapt to them using punch cards, cryptic programming languages, command prompts, and eventually today’s graphical user interfaces. Touchscreens have helped make computers easier for everyone to pick up and use, as they translate our innate spatial intelligence into a digital format.

Today, computers are powerful enough to understand natural speech. When you ask for help or information, you’re increasingly likely to deal with a virtual agent, and voice assistants are everywhere.

As computer interfaces become more intuitive and natural, adding emotional information to this interaction could transform how these interfaces work.

RELATED: How to Use a Voice Assistant Without It “Always Listening”

Emotions are hard for people, too

Despite the fact that we evolved to understand and express emotions, humans get things wrong all the time. While some people seem to have an almost supernatural level of emotional intelligence, for most people it’s still a complex task.

So while affective computing sounds like a great idea on paper, in practice it’s not that simple, even with all the amazing new technology we have. It is reasonable to expect that the first mainstream systems to use this approach will focus on a small set of basic emotional expressions.

If your computer knows you’re exhausted, it may suggest you take a break. If it knows that certain images in your wallpaper slideshow make you happier than others, it might put them in heavier rotation or add other, similar images.
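The wallpaper idea can be sketched in a few lines of Python. The reaction scores below are made-up stand-ins for real affective measurements; the point is only to show how observed positive reactions could bias which image appears next.

```python
import random

# Hypothetical average "happiness" reaction per image, on a 0-1 scale.
# In a real system these would come from affective measurements over time.
reaction_scores = {
    "beach.jpg": 0.9,
    "mountains.jpg": 0.6,
    "spreadsheet.png": 0.1,
}

def pick_wallpaper(scores: dict[str, float]) -> str:
    """Pick an image at random, favoring ones with higher positive affect."""
    images = list(scores)
    weights = [scores[img] for img in images]
    return random.choices(images, weights=weights, k=1)[0]

print(pick_wallpaper(reaction_scores))
```

Over many slideshow cycles, the beach image would appear roughly nine times as often as the spreadsheet screenshot, without ever removing the low-scoring image entirely.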

Obviously, affective computing could benefit us in many ways, but don’t expect it to be perfect from day one!

The Dark Side of Affective Computing

Affective computing represents a significant leap in the way people interact with machines, but it also opens users up to new forms of exploitation.

Marketing psychology is already adept at manipulating our emotions to change our buying behavior. That’s why a car ad focuses on how a car will make you feel rather than how powerful or how fuel efficient it is.

A lot of our decision-making is driven by emotion, so imagine if social media companies could read your emotional reaction to posts or advertisements. One day you may need to grant an “Emotional Analysis” permission alongside those for your camera and microphone.

RELATED: 5 psychological tricks in free games (and how to avoid them)

Sherry J. Basler