By John P. Desmond, AI Trends Editor
The emotion recognition software segment is projected to grow dramatically in coming years, spelling success for companies that have established a beachhead in the market, while causing some who are skeptical about its accuracy and fairness to raise red flags.
The global emotion detection and recognition market is projected to grow to $37.1 billion by 2026, up from an estimated $19.5 billion in 2020, according to a recent report from MarketsandMarkets. North America is home to the largest market.
Software suppliers covered in the report include: NEC Global (Japan), IBM (US), Intel (US), Microsoft (US), Apple (US), Gesturetek (Canada), Noldus Technology (Netherlands), Google (US), Tobii (Sweden), Cognitec Systems (Germany), Cipia Vision Ltd (Formerly Eyesight Technologies) (Israel), iMotions (Denmark), Numenta (US), Elliptic Labs (Norway), Kairos (US), PointGrab (US), Affectiva (US), nViso (Switzerland), Beyond Verbal (Israel), Sightcorp (Holland), Crowd Emotion (UK), Eyeris (US), Sentiance (Belgium), Sony Depthsense (Belgium), Ayonix (Japan), and Pyreos (UK).
Among the users of emotion recognition software today are auto manufacturers, who use it to detect drowsy drivers and to identify whether the driver is engaged or distracted.
Some question whether emotion recognition software is effective, and whether its use is ethical. One research study, recently summarized in a Sage journal, examines the assumption that facial expressions are a reliable indicator of emotional state.
“How people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation,” stated the report, from a team of researchers led by Lisa Feldman Barrett, of Northeastern University, Mass General Hospital and Harvard Medical School.
The research team is suggesting that further study is needed. “Our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life,” the report stated.
Technology companies are spending millions on projects to read emotions from faces. “A more accurate description, however, is that such technology detects facial movements, not emotional expressions,” the report authors stated.
Affectiva to be Acquired for $73.5 Million by Smart Eye of Sweden
Recent beneficiaries of the popularity of emotion recognition software are the founders of Affectiva, which recently reached an agreement to be acquired by Smart Eye, a Swedish company providing driver monitoring systems for about a dozen automakers, for $73.5 million in cash and stock.
Affectiva was spun out of MIT in 2009 by founders Rana el Kaliouby, who had been its CEO, and Rosalind Picard, who heads the Affective Computing group at MIT. Kaliouby recounted her experience founding Affectiva in her book, Girl Decoded.
“As we watched the driver monitoring system category evolve into Interior Sensing, monitoring the whole cabin, we quickly recognized Affectiva as a major player to watch,” stated Martin Krantz, CEO and founder of Smart Eye, in a press release. “Affectiva’s pioneering work in establishing the field of Emotion AI has served as a powerful platform for bringing this technology to market at scale,” he stated.
Affectiva CEO Kaliouby stated, “Not only are our technologies very complementary, so are our values, our teams, our culture, and perhaps most importantly, our vision for the future.”
Some have called for government regulation of emotion intelligence software. Kate Crawford, senior principal researcher at Microsoft Research New York, and author of the book Atlas of AI (Yale, 2021), wrote recently in Nature, “We can no longer allow emotion-recognition technologies to go unregulated. It is time for legislative protection from unproven uses of these tools in all domains—education, health care, employment, and criminal justice.”
The reason, Crawford stated, is that companies are selling software that affects the opportunities available to individuals “without clearly documented, independently audited evidence of effectiveness.” Examples include job applicants being judged on facial expressions or vocal tones, and students flagged at school because their faces may seem angry.
The science behind emotion recognition is increasingly being questioned. A review of 1,000 studies found that the link between facial expressions and emotions is not universal, according to a recent account in OneZero. The researchers found that people made the facial expression expected to match their emotional state only 20% to 30% of the time.
Startups including Find Solution AI base their emotion recognition technology on the work of Paul Ekman, a psychologist who published on the similarities between facial expressions around the world, popularizing the notion of “seven universal emotions.”
The work has been challenged in the real world. A TSA program that trained agents to spot terrorists using Ekman’s work found little scientific basis, did not result in arrests, and fueled racial profiling, according to filings from the Government Accountability Office and the ACLU.
Dr. Barrett’s team of researchers concluded, “The scientific path forward begins with the explicit acknowledgment that we know much less about emotional expressions and emotion perception than we thought we did.”