Briefings Magazine

I Can Read Your Mind

How are your remote workers feeling today? Companies will soon have the technology to look for emotional giveaways on screens—but should they use it?


By: Peter Lauria

The way he tilted his head was the first clue. How his eyes kept drifting toward the ceiling was another. But the dead giveaway that James wasn’t the right executive to lead the integration of the company’s most recent acquisition was his face. More specifically, his lips, and how tightly he held them together in a forced smile. 

But how would anyone know that? In this recreated case, emotion-detection software studied a webcam video of James during a post-acquisition meeting. Sure, James said all the right things, but a review of his facial cues, combined with an analysis of his tone of voice and other biometric data, revealed something else. Together, the data suggested he was anxious about his ability to succeed in the role.
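In rough terms, tools like the one in this scenario fuse separate readings (face, voice, words) into a single estimate. The sketch below illustrates that fusion step in Python; the scores, weights, and labels are invented for illustration and stand in for the output of a vendor’s upstream models, not any real product’s API.

```python
# Illustrative only: a toy fusion of per-modality emotion scores.
# The modality names, scores, and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class ModalityReading:
    source: str           # e.g., "face", "voice", "words"
    anxiety_score: float  # 0.0 (calm) to 1.0 (anxious), from an upstream model
    weight: float         # how heavily this modality counts in the fusion

def fused_anxiety(readings: list[ModalityReading]) -> float:
    """Weighted average of per-modality anxiety estimates."""
    total_weight = sum(r.weight for r in readings)
    return sum(r.anxiety_score * r.weight for r in readings) / total_weight

readings = [
    ModalityReading("face", 0.72, weight=0.5),   # tight-lipped smile, drifting gaze
    ModalityReading("voice", 0.61, weight=0.3),  # strained tone
    ModalityReading("words", 0.15, weight=0.2),  # transcript says all the right things
]
print(f"fused anxiety estimate: {fused_anxiety(readings):.2f}")  # face and words disagree
```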

The human face has 43 muscles that can be combined into more than 10,000 different expressions, each of them offering subtle, unconscious clues about our feelings and thoughts. Today, a once-nascent field of neuroscience known as emotion detection has grown into a booming corner of the tech industry. And it’s one that the corporate world is quietly taking a keen interest in. Call it high-tech mind reading.

Yes, it sounds very James Bond-like, but for decades, law enforcement and spy agencies have used software that analyzes facial expressions and looks for emotional giveaways to catch criminals or conduct covert operations. Now, companies are tiptoeing into the act, using the technology mostly for security purposes, but not dismissing the possibility of probing into the minds of employees—particularly those at home. The market for emotion-detection and facial-recognition software is forecast to exceed $100 billion globally by 2030. 

Software providers who sell this technology to the private sector are tight-lipped but insist that it isn’t just about discerning whether employees are lying or engaged. Emotion detection, they say, can tip off managers that a worker is feeling stressed or fatigued, based on photographs captured during the workday and analyzed later. “The technology can help predict an employee’s behavior or mood and suggest ways managers can intervene,” says Chris Cantarella, global sector leader for software at Korn Ferry.

The irony, of course, is that the use of such technology can itself cause anxiety and stress. Emotion-detection software is only the latest in a series of tools that critics view as invasive and less-than-precise means of monitoring employees. The new norm of remote work may only spur its use, as firms struggle to figure out how engaged, bored, or tired their staffs are. “There are fewer boundaries now between employer and employee than ever before,” says J.S. Nelson, a visiting researcher with the Program on Negotiation at Harvard Law School and author of an upcoming book on workplace surveillance. “The question is how much control can the company exert over the person?”


The idea that emotions can be revealed through facial expressions has its roots in the late 1960s, in the fieldwork of a famous psychologist, Paul Ekman. Using funds from the US Defense Department, Ekman traveled as far as Papua New Guinea to study tribespeople and prove that some facial expressions were universal. Ultimately, he identified seven: anger, disgust, fear, surprise, happiness, sadness, and contempt. His research drew such widespread attention that everyone from the CIA to the movie studio Pixar—and even the Dalai Lama—would eventually seek him out.

But the real leap forward in facial-expression tech came after the 9/11 attacks, which turned emotion detection into a security imperative. Governments needed a way to identify bad actors and security threats among millions of faces, and as it turned out, the data was there—thanks to the internet and social media. The door was open to the kind of mass adoption that artificial intelligence needed to read faces. “It all desensitized people to using technology to track them,” says Cantarella. “Companies could mine that data.”

Today, this facial data has an ever-growing range of uses in both the public and private sectors. Many companies rely on it for security purposes; it’s also used to monitor building access and, in some cases, to unlock devices. Elke Oberg, marketing manager at facial-recognition software company Cognitec Systems, says one US company with a large base of contingent workers found that people who’d been fired were reapplying—and in some cases getting rehired—under different names. “The company uses our technology to match applicants’ faces with faces of previously fired employees to ensure they don’t hire them again,” says Oberg. 
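Under the hood, that kind of match is typically an embedding comparison: a model converts each face photo into a vector of numbers, and two vectors that are close enough are treated as the same person. The sketch below shows the general pattern; the threshold and the embeddings are placeholders, since Cognitec’s actual pipeline is proprietary.

```python
# Generic sketch of embedding-based face matching, not Cognitec's code.
# A real face-recognition model would produce the embedding vectors.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.85  # hypothetical; real systems tune this on labeled photo pairs

def flag_rehire(applicant: list[float], former_employees: list[list[float]]) -> list[int]:
    """Return indices of former employees whose face embeddings match the applicant."""
    return [
        i for i, emb in enumerate(former_employees)
        if cosine_similarity(applicant, emb) >= MATCH_THRESHOLD
    ]
```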

Anyone who has ever called customer service is familiar with the “this call is being monitored” automated message. Less well-known is the fact that these recordings are reviewed not just for training purposes, but also to evaluate the emotional condition of both the employee and caller. For instance, Cogito, a developer of emotional and conversational AI, provides software that analyzes voice signals and gives real-time feedback to call-center representatives on how to adjust their behavior. It can detect extreme frustration and alert managers when they should intervene. 
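Cogito doesn’t publish its internals, but the alerting pattern can be sketched generically: score each short window of call audio with a voice model, then escalate only when a frustration signal stays elevated across several consecutive windows. Everything below, from the threshold to the window count, is an assumption made for illustration.

```python
# Generic sketch of sustained-signal alerting; not Cogito's actual logic.
from collections import deque

FRUSTRATION_THRESHOLD = 0.8  # hypothetical score cutoff
SUSTAINED_WINDOWS = 3        # alert only after several consecutive hot windows

def monitor_call(window_scores):
    """window_scores: per-window frustration estimates from an upstream voice model."""
    recent = deque(maxlen=SUSTAINED_WINDOWS)
    for t, score in enumerate(window_scores):
        recent.append(score)
        if len(recent) == SUSTAINED_WINDOWS and min(recent) >= FRUSTRATION_THRESHOLD:
            yield f"window {t}: sustained caller frustration, alert a manager"

for alert in monitor_call([0.3, 0.9, 0.85, 0.92, 0.4]):
    print(alert)
```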

Cantarella says he knows of companies that use emotion-detection technology as part of an overall assessment of an employee’s engagement and productivity. He says firms use AI algorithms to detect boredom or disinterest from expressions on Zoom calls, for instance. They can combine that data with other tracking activity—such as staffers reaching out on LinkedIn more, or using specific words in emails or chats that might signify unhappiness—to alert firms if it appears that someone might want to leave. “It’s invasive but it doesn’t have to be insidious,” says Cantarella, noting that employees often don’t realize they have effectively signed waivers consenting to the probing of their work activity.
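A rough sketch of that signal-combining idea appears below; the feature names and weights are invented for illustration and don’t reflect any particular firm’s model.

```python
# Hypothetical attrition-risk scoring from combined workplace signals.
ATTRITION_SIGNALS = {
    "zoom_disengagement": 0.4,   # boredom detected on video calls
    "linkedin_outreach": 0.35,   # increased activity on LinkedIn
    "negative_language": 0.25,   # unhappy word choices in email or chat
}

def attrition_risk(observations: dict[str, float]) -> float:
    """Weighted sum of normalized (0-1) signals; higher means greater flight risk."""
    return sum(
        weight * observations.get(signal, 0.0)
        for signal, weight in ATTRITION_SIGNALS.items()
    )

risk = attrition_risk({"zoom_disengagement": 0.7, "linkedin_outreach": 0.9})
print(f"attrition risk: {risk:.2f}")  # roughly 0.6 on this toy input
```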

As more uses have come into play, experts say, the field has fueled a research and investment boom. Part of that comes from the normalization of remote and hybrid work, which has raised a host of concerns among business leaders about productivity and intellectual property. Shaun Moore—founder of a facial-recognition software company later acquired by Pangiam, where he is now chief AI officer—says most companies use the technology to protect their intellectual property by identifying bad actors over social media. But while he doubts there will be any mass adoption of mind reading, the AI will continue to evolve, and he does foresee a day when he’ll “definitely be fielding questions” from companies about emotional detection.

“It’s invasive but it doesn’t have to be insidious.”

Still, how far can mind reading evolve? The field keeps growing, but some critics say there’s no conclusive evidence that this sector of AI actually works. For one thing, the algorithms behind it can be gamed: an employee could be suffering from crushing depression, but if they are laughing and smiling, the algorithm is likely to register them as happy. “It is hard to take a snapshot of an emotional state,” says Moore. Cognitec’s Oberg agrees. She says algorithms can guide an observer to the emotion someone is expressing, but it’s the observer who ultimately has to make a human assessment. “AI can’t tell if the emotion detected is forced or fake. You need a person for that,” says Oberg.
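Oberg’s point maps onto a familiar human-in-the-loop pattern: the software proposes a label, and anything below a confidence floor gets routed to a person for review. A minimal sketch, with a made-up confidence cutoff:

```python
# Human-in-the-loop triage sketch; the cutoff value is hypothetical.
CONFIDENCE_FLOOR = 0.9  # below this, the reading goes to a human reviewer

def triage(emotion_label: str, confidence: float) -> str:
    if confidence < CONFIDENCE_FLOOR:
        return f"defer: possible '{emotion_label}' ({confidence:.2f}), needs human review"
    return f"report: '{emotion_label}' ({confidence:.2f}), human judges if it is genuine"

print(triage("happy", 0.95))  # a broad smile scores high even when forced
print(triage("sad", 0.55))
```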

Another obstacle may be legal. While there are no universal laws around privacy or data protection, employers in Europe generally cannot collect biometric data like heart rate or fingerprints without an employee’s consent. “It is basically a way to ensure that facial or emotional AI is used as a last resort,” says Oberg. In the US, by contrast, only a few states have laws outlining the biometric information employers can collect from employees.

What’s more, employees are increasingly challenging employers to protect their privacy rights in the remote-work era. Recently, a court ruled in favor of a worker who sued his employer for wrongful termination after he was fired for refusing to keep his webcam on. In its ruling, the court said the webcam mandate violated “the employee’s right to respect for his private life.”

All of this is occurring against the backdrop of the so-called purpose movement in business, which emphasizes trust, not monitoring, of employees. “Employees are questioning more and more what it means to be a worker, and companies need to tread carefully,” says Nelson, the visiting researcher at Harvard.

The irony, she says, is that surveillance technology may end up driving out star performers instead of underperformers—over issues of trust. “Firms are telling employees they care about them,” she says. “But does the employee really believe that?”

 
