Feeling glum, happy, aroused? New technology can detect your mood

Story highlights

Explore the fast-developing world of technology that can read your mood

Cameras and electronic sensors can provide information on a person's state of mind

It opens up the possibility of communicating emotions we don't even realize we have

Advertisers, medical researchers and teachers are all interested in the tech

CNN  — 

A long-distance drive can be lonely with only a radio for company, and if the driver is stressed or tired, it becomes dangerous.

A car that could understand those feelings might prevent an accident, using emotional data to flag warning signs. Sensors could nest in the steering wheel and door handles to pick up electric signals from the skin. Meanwhile a camera mounted on the windshield could analyze facial expressions.

If the driver exhibits stress, the vehicle’s coordinated sensors could soften the light and music, or broaden the headlight beams to compensate for loss of vision. A distressed state could be broadcast as a warning to other motorists by changing the color of the vehicle’s conductive paint.
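To make the idea concrete, here is a minimal sketch in Python of how such a system might fuse two stress signals and escalate its response. Everything in it (the signal names, weights, thresholds and interventions) is invented for illustration; the MIT project has not published this logic.

```python
# Hypothetical sketch of an AutoEmotive-style decision loop.
# Signals are assumed to be normalized to the 0-1 range.

def stress_score(skin_conductance: float, facial_tension: float) -> float:
    """Blend two stress indicators into one score (weights are made up)."""
    return 0.6 * skin_conductance + 0.4 * facial_tension

def choose_intervention(score: float) -> str:
    """Map a fused stress score onto an escalating response."""
    if score > 0.8:
        return "change conductive paint color to warn other motorists"
    if score > 0.5:
        return "soften cabin light and music; broaden headlight beams"
    return "no action"

# Example: sweaty palms on the wheel plus a tense expression.
print(choose_intervention(stress_score(0.9, 0.7)))
```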

This empathic vehicle is the goal of AutoEmotive, a research project from the Affective Computing group at MIT’s Media Lab, which explores the potential of emotional connections with machines. AutoEmotive is the group’s latest and most integrated project, following successful efforts to build emotion-sensing interfaces into everything from bras to mirrors.

Researchers believe the concept is destined for the mainstream and have fielded interest from manufacturers. “We have already tested most of these sensors,” says Javier Rivera, MIT researcher and project leader. “The hardware required could easily be built into cars. Most cars have cameras anyway; you would just add more to capture the physiology. It could be done unobtrusively.”

No time like the present

But we don’t have to wait for emotion sensors: they are already flooding into a new market, using a growing range of mood metrics to suit diverse applications. Voice-analysis app Beyond Verbal claims it can tell you whether you flirt too much in just 20 seconds. A sweater that detects skin stimulation to color-code your feelings is available for pre-order.

The fastest-developing method is facial recognition, led by Affectiva, a start-up that spun off from MIT’s Affective Computing group three years ago. In that time, the company has amassed a database of over a billion facial expressions, which it uses to train algorithms to recognize and classify basic emotions such as happiness or anger with over 90% accuracy.
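The underlying method is standard supervised learning: collect labeled examples of expressions, extract numeric features from each face, and train a classifier. The toy sketch below shows the shape of that pipeline; Affectiva’s real features and models are proprietary, and the data here is random noise standing in for facial measurements.

```python
# Toy emotion-classifier training loop (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))    # stand-in for 20 facial-landmark features
y = rng.integers(0, 2, size=1000)  # mock labels: 0 = happiness, 1 = anger

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

# On random noise this hovers near 50%; a real expression database
# is what pushes accuracy toward the 90% Affectiva reports.
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```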

Its flagship technology, Affdex, has been swiftly adopted by advertisers, who use it to test reactions to their campaigns and modify them accordingly. Market research partner Millward Brown has standardized its use for Fortune 500 clients including PepsiCo and Unilever.

“In the past this technology was confined to laboratories because of high cost and slow turnaround,” explains Nick Langeveld, Chief Executive Officer of Affectiva. “We’ve cracked those issues; the cost is very low as the service is over the web, and it can be turned around almost immediately after the data is collected.”

Competitor Emotient also specializes in facial recognition, but its primary target is the retail sector. Its software is on trial in stores, tracking 44 facial movements to monitor the emotional reactions of staff and shoppers, as well as demographic information including age and gender. From customer satisfaction to employee morale, the benefits to business are obvious, and Emotient claims major retail partners plan to make the system permanent.

Medical applications

It is also time to bring these tools into clinical practice, believes Dr. Erik Viirre, a San Diego neurophysiologist. “While so many medications list suicide risk as a possible side effect, I think we have to use biosensors, and there is a big push within psychiatry to bring them in. Thought disorders could be picked up much more quickly and used to determine treatment.”

Viirre has studied headaches extensively and found that contributing factors, including mood, build up in the days before they strike. He argues a multi-sensor approach combining brain scans, genetic tests and emotion sensing could dramatically improve treatment.

But emotion sensors are currently limited in their capacity to differentiate nuanced expression, says Tadas Baltrušaitis of the University of Cambridge Computer Laboratory, who has published research on the subject.

“It is easy to train a computer to recognize basic emotions, such as fear or anger. It is more difficult to recognize more complex emotional states that might also be culturally dependent, such as confusion, interest and concentration.”

But there is scope for rapid progress: “The field is relatively new, and only recently has it been possible to recognize emotions in real-world environments with a degree of accuracy. The approaches are getting better every year, leading to more subtle expressions being recognizable by machines.”

Baltrušaitis adds that combined sensors that pick up signals from skin, pulse, face, voice and more – as with AutoEmotive – could be key to progress.

Buyers beware

In this post-NSA climate, companies are keen to head off privacy concerns. Affectiva and Emotient insist that all their data has been gathered with the subjects’ permission, while Emotient defends its use of recognition software in stores by saying it does not record personal details.

But the technology is prone to abuse, according to futurist and information systems expert Chris Dancy. “I think variations are already being used in places like airports and we would never know,” he says. “I can’t imagine a system to take value readings of my mind for a remote company being used for good. It’s a dark path.”

Producers claim they strictly control the use of their sensors, but facial recognition technology is proliferating. UK supermarket Tesco could face legal action for introducing it in stores without permission, while San Diego police have been quietly issued with a phone-based version.

Ironically, Dancy – a leading proponent of the Quantified Self movement – is pursuing many of the same insights into emotion as advertisers, but by alternative means and for personal goals. He keeps himself connected to sensors measuring pulse, REM sleep, blood sugar and more, and cross-references the readings against environmental inputs to see how the two correlate, using the results to understand and influence his state of mind.
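The mechanics of that cross-referencing are simple to sketch: align the body’s logs and the environment’s logs on a shared timeline, then look for correlations. The column names and readings below are invented for illustration.

```python
# Minimal quantified-self correlation sketch (fabricated data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
hours = pd.date_range("2014-01-06", periods=168, freq="h")  # one week, hourly
log = pd.DataFrame({
    "pulse": rng.normal(70, 8, len(hours)),          # heart-rate sensor
    "blood_sugar": rng.normal(95, 10, len(hours)),   # glucose monitor
    "ambient_noise": rng.normal(50, 15, len(hours)), # environmental input
}, index=hours)

# Which environmental signals move with which body signals?
print(log.corr().round(2))
```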

‘Moodhacking’ has become a popular practice among the technologically curious, and has given rise to successful applications. Members of London’s Quantified Self chapter created tools such as Moodscope and Mappiness, which help users match their mental state to external events. Hackers and makers will have an even more powerful tool in March, when the crowd-funded OpenBCI device makes EEG brainwave data available to anyone with a computer at a bargain price.
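As a rough illustration of what a hobbyist could do with that raw signal, the sketch below estimates how much of an EEG trace’s power sits in the alpha band (8-12 Hz), a band often read as a relaxation marker. The signal here is synthetic; a real board would stream samples to the computer.

```python
# Alpha-band power estimate from a (synthetic) EEG trace.
import numpy as np
from scipy.signal import welch

fs = 250                      # samples per second, a common EEG rate
t = np.arange(0, 10, 1 / fs)  # ten seconds of signal
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # fake 10 Hz rhythm in noise

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)   # power spectral density
alpha = psd[(freqs >= 8) & (freqs <= 12)].sum()  # power in the alpha band
print(f"alpha share of total power: {alpha / psd.sum():.1%}")
```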

For all the grassroots hostility towards corporate use of emotion sensors, there may be convergence. Affectiva is keen to market to the Quantified Self demographic, and an Affdex app for Android is imminent. As the machine learning develops, and different industries combine to join the dots, we can all expect to be sharing a lot more.
