Market for Emotion Recognition Projected to Grow as Some Question Science 

By John P. Desmond, AI Trends Editor 

The emotion recognition software segment is projected to grow dramatically in the coming years, spelling success for companies that have established a beachhead in the market, even as skeptics raise red flags about the technology's accuracy and fairness.

The global emotion detection and recognition market is projected to grow to $37.1 billion by 2026, up from an estimated $19.5 billion in 2020, according to a recent report from MarketsandMarkets. North America is home to the largest market.  
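
For context, those figures imply a compound annual growth rate of roughly 11% per year between 2020 and 2026. A quick back-of-the-envelope check of that implied rate (our own arithmetic based on the report's start and end figures, not a rate quoted from it):

```python
# Implied compound annual growth rate (CAGR) from the cited MarketsandMarkets figures.
# The 2020 and 2026 values come from the report summary above; the CAGR itself is
# back-of-the-envelope arithmetic, not a figure quoted from the report.
start_value = 19.5   # estimated 2020 market size, $ billion
end_value = 37.1     # projected 2026 market size, $ billion
years = 6            # 2020 -> 2026

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 11.3% per year
```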

Software suppliers covered in the report include: NEC Global (Japan), IBM (US), Intel (US), Microsoft (US), Apple (US), Gesturetek (Canada), Noldus Technology (Netherlands), Google (US), Tobii (Sweden), Cognitec Systems (Germany), Cipia Vision Ltd (Formerly Eyesight Technologies) (Israel), iMotions (Denmark), Numenta (US), Elliptic Labs (Norway), Kairos (US), PointGrab (US), Affectiva (US), nViso (Switzerland), Beyond Verbal (Israel), Sightcorp (Holland), Crowd Emotion (UK), Eyeris (US), Sentiance (Belgium), Sony Depthsense (Belgium), Ayonix (Japan), and Pyreos (UK). 

Among the users of emotion recognition software today are auto manufacturers, who use it to detect drowsy drivers and to identify whether the driver is engaged or distracted.

Some question whether emotion recognition software is effective, and whether its use is ethical. One research study, recently published in Sage Journals, examines the assumption that facial expressions are a reliable indicator of emotional state.

Lisa Feldman Barrett, professor of psychology, Northeastern University

“How people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation,” stated the report, from a team of researchers led by Lisa Feldman Barrett, of Northeastern University, Mass General Hospital and Harvard Medical School.   

The research team is suggesting that further study is needed. “Our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life,” the report stated. 

Technology companies are spending millions on projects to read emotions from faces. “A more accurate description, however, is that such technology detects facial movements, not emotional expressions,” the report authors stated.  

Affectiva to Be Acquired for $73.5 Million by Smart Eye of Sweden 

Recent beneficiaries of the popularity of emotion recognition software are the founders of Affectiva, which recently reached an agreement to be acquired by Smart Eye, a Swedish company providing driver monitoring systems for about a dozen automakers, for $73.5 million in cash and stock. 

Affectiva was spun out of MIT in 2009 by founders Rana el Kaliouby, its CEO, and Rosalind Picard, who heads the Affective Computing group at MIT. Kaliouby recounted her experience founding Affectiva in her book, Girl Decoded.

“As we watched the driver monitoring system category evolve into Interior Sensing, monitoring the whole cabin, we quickly recognized Affectiva as a major player to watch,” stated Martin Krantz, CEO and founder of Smart Eye, in a press release. “Affectiva’s pioneering work in establishing the field of Emotion AI has served as a powerful platform for bringing this technology to market at scale,” he stated.

Affectiva CEO Kaliouby stated, “Not only are our technologies very complementary, so are our values, our teams, our culture, and perhaps most importantly, our vision for the future.”  

Kate Crawford, senior principal researcher, Microsoft Research

Some have called for government regulation of emotion recognition software. Kate Crawford, senior principal researcher at Microsoft Research New York and author of the book Atlas of AI (Yale, 2021), wrote recently in Nature, “We can no longer allow emotion-recognition technologies to go unregulated. It is time for legislative protection from unproven uses of these tools in all domains—education, health care, employment, and criminal justice.”

The reason, Crawford argued, is that companies are selling software that affects the opportunities available to individuals “without clearly documented, independently-audited evidence of effectiveness.” This includes job applicants being judged on facial expressions or vocal tones, and students being flagged at school because their faces seem angry.

The science behind emotion recognition is increasingly being questioned. A review of 1,000 studies found that the link between facial expressions and emotions is not universal, according to a recent account in OneZero. The researchers found people made the expected facial expression to match their emotional state only 20% to 30% of the time.

Startups including Find Solution AI base their emotion recognition technology on the work of Paul Ekman, a psychologist who published on the similarities between facial expressions around the world, popularizing the notion of “seven universal emotions.”   

The work has been challenged in the real world. A TSA program that trained agents to spot terrorists using Ekman’s work was found to have little scientific basis, did not lead to arrests, and fueled racial profiling, according to reports from the Government Accountability Office and the ACLU.

Dr. Barrett’s team of researchers concluded, “The scientific path forward begins with the explicit acknowledgment that we know much less about emotional expressions and emotion perception than we thought we did.”  

Affectiva/Smart Eye Welcomes Further Research

Reached for comment by AI Trends after the acquisition of Affectiva, Gabi Zijderveld, now Chief Marketing Officer of Smart Eye, said the company welcomes further academic research into emotion recognition. “We do a lot with MIT,” she said. One ongoing project studies cognitive load, the state in which a driver’s mind wanders from the road, “to see if there are facial indicators that correlate with this state of cognitive load,” she said.

As a member of MIT’s Advanced Vehicle Technology Consortium, Affectiva/Smart Eye participates in research around how drivers engage with vehicle automation. One project involves recording people inside vehicles outfitted with multiple cameras, at all times of the day and days of the week, to pick up differences in behavior. This provides context for emotion recognition, a point Zijderveld emphasized in response to criticism about the validity of emotion recognition software. 

Some criticism has been that computers cannot know with certainty what someone is feeling. “We believe context is the key,” she said. “Facial expressions are a reaction to certain contexts. It is a temporal state that evolves over time. You can’t just look at isolated images,” she said. 

Affectiva trains its algorithms based on video data, so they can see the onset, progression and ending of a state. Drowsiness, for example, “is a complicated state that involves different stages.” Gathering data on it is difficult. Affectiva enrolled shift workers driving home after work to help gather the data. “It helps you get the subtle nuances, so you don’t have a system shooting off alerts if someone yawns one time,” she said. 
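
Zijderveld’s point about temporal context can be illustrated with a minimal sketch: rather than reacting to a single frame, a system might accumulate evidence over a sliding window of per-frame scores and alert only when a state persists. The window size, threshold, and score source below are illustrative assumptions, not Affectiva’s actual pipeline.

```python
from collections import deque

def drowsiness_alerts(frame_scores, window=90, threshold=0.6, min_fraction=0.7):
    """Flag drowsiness only when it persists across a window of video frames.

    frame_scores: per-frame drowsiness scores in [0, 1] (the scoring model is
                  assumed, not specified in the article).
    window:       number of recent frames considered (~3 seconds at 30 fps).
    threshold:    per-frame score counted as "drowsy-looking".
    min_fraction: fraction of the window that must exceed the threshold before
                  an alert fires, so a single yawn does not set it off.
    """
    recent = deque(maxlen=window)
    alerts = []
    for i, score in enumerate(frame_scores):
        recent.append(score >= threshold)
        if len(recent) == window and sum(recent) / window >= min_fraction:
            alerts.append(i)  # frame index where sustained drowsiness is detected
    return alerts

# A brief spike (one yawn) should not alert; a sustained episode should.
scores = [0.2] * 100 + [0.9] * 10 + [0.2] * 100 + [0.9] * 200
alerts = drowsiness_alerts(scores)
print(alerts[0] if alerts else "no alert")  # first frame of a sustained episode
```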

Automotive safety, a key focus of the combined company, is likely to be “multi-modal” in the future, with vision systems that detect emotions interacting with lane drift data, for example. “It’s unlikely to rely on computer vision alone,” she said.
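
A rough sketch of what such multi-modal fusion might look like, combining a camera-based driver-state score with lane-drift telemetry; the signal names, weights, and simple weighted sum are illustrative assumptions, not a description of Smart Eye’s product.

```python
def fused_risk(driver_state_score, lane_drift_rate, w_vision=0.6, w_drift=0.4):
    """Combine a vision-based driver-state estimate with lane-drift telemetry.

    driver_state_score: 0..1 estimate of drowsiness/distraction from the cabin camera.
    lane_drift_rate:    0..1 normalized measure of unintended lane departures.
    Both inputs and the weighted sum are hypothetical, for illustration only.
    """
    return w_vision * driver_state_score + w_drift * lane_drift_rate

# A drowsy-looking driver who is also drifting produces a higher combined risk
# than either signal alone, which is the rationale for multi-modal sensing.
print(fused_risk(0.7, 0.2))  # camera evidence only: 0.50
print(fused_risk(0.7, 0.8))  # both signals elevated: 0.74
```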

Rana el Kaliouby will be Deputy CEO, working with Smart Eye founder Krantz going forward. She will remain primarily in the Boston area, giving the combined company access to the startups, incubators, venture capitalists, and academics in the region. “She has many more plans,” Zijderveld said.

Read the source articles and information from MarketsandMarkets, in Sage Journals, in a press release from Smart Eye, in Nature, and in OneZero.
