Recognize Me If You Can: Can Facial Recognition Systems Really Tell How We Feel?
Facial recognition systems are becoming increasingly widespread across a range of fields. According to a forecast by MarketsandMarkets, the associated market will grow to seven billion dollars by 2024. Media reports abound about this technology helping catch criminals, determine whether shoppers are satisfied, and even play the music they like. However, some research indicates that AI still struggles to recognize people's real emotions.
High hopes for facial recognition
The popularity of facial recognition services continues to grow every year, Apple's Face ID being one of the most prominent recent examples. According to its developers, the technology can recognize faces even if the user is wearing a hat, scarf, contact lenses, or sunglasses. What's more, it works both indoors and outdoors, and even in complete darkness.
A June 2019 study by the MarketsandMarkets research company estimates the global facial recognition market at 3.2 billion dollars in 2019 and projects that it could reach seven billion dollars by 2024, assuming a compound annual growth rate of 16% over 2019-2024. According to experts, the two key drivers of this growth are surveillance in the public sector and the technology's numerous applications in other market segments.
Security. Facial recognition is used when issuing identification documents, often in combination with other biometric technologies. Today, comparing travelers' faces with the photos in their biometric passports at border control is no longer a novelty.
There are also many reports of criminals being apprehended with the help of facial recognition after being detected among thousands of unsuspecting people. In 2017, Chinese police arrested a wanted man after a surveillance system recognized him at a concert in a crowd of 70,000.
In the UK, such services have been used since 2016. In May 2019, the London Policing Ethics Panel (an independent panel set up by the Mayor of London) released a 66-page report on the use of facial recognition technology. 56% of the poll's respondents said that they trust the police and are confident that the technology will be used in accordance with the law, while 83% said they consider its further use inevitable. The panel itself, meanwhile, recommended that the police keep in mind that a technology's social benefit has to outweigh the public's distrust of it.
According to the Russian media, facial recognition technology performed well during the 2018 FIFA World Cup. Among other things, it helped arrest over 100 offenders, solve the theft of a sponsor's trophy, and prevent a crush in one of the fan zones. The devices used the FaceT and FindFace technologies by NtechLab.
Facial recognition is also used to track wanted criminals. The system scans camera images, checks them against the police database and, if it finds a match, sends a notification to police officers' smartphones. The first 1,500 cameras to use facial recognition were tested in Moscow in 2017, and by 2018 their number had reached 7,000. By the end of 2019, the Moscow government plans to hold a contest for a new facial recognition system that will connect to over 200,000 of the city's cameras.
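At its core, the matching step in such a system is a nearest-neighbor search over face embeddings. The sketch below is a simplified illustration, not the actual Moscow system: real deployments derive embeddings from deep neural networks and search large-scale indexes, and the toy vectors, names, and threshold here are invented.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Return (name, score) of the best watchlist match above threshold, else None."""
    best_name, best_score = None, -1.0
    for name, embedding in watchlist.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# Toy embeddings; a real system would compute these with a face-recognition CNN.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.3],
    "suspect_b": [0.2, 0.8, 0.5],
}
probe = [0.88, 0.12, 0.31]  # face captured by a street camera
hit = match_against_watchlist(probe, watchlist)
if hit:
    print(f"ALERT: possible match {hit[0]} (score {hit[1]:.2f})")
```

The threshold is the operational knob: set it too low and officers are flooded with false alerts, too high and genuine matches slip through.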
Shopping. Retailers have also shown interest in facial recognition over the past several years. For one, the Hong Kong-based tech company Jardine One Solution (JOS) notes that many chain store owners use facial recognition to collect data on their customers.
According to JOS group managing director Mark Lunt, stores collect data on the number of customers, their age, sex and ethnic background in order to better assess attendance and develop apps that suit their clients better. What’s more, some even use such data to choose music that is played at their venues.
Several years ago, Walmart, another major retailer, reported that it had developed technology that uses facial recognition to tell whether a customer is satisfied with their visit to the store and, if not, to contact them and resolve any issues. The company's representatives note that a client's biometric data can be checked against their transactions, allowing the company to detect changes in a customer's buying patterns that result from dissatisfaction.
Other fields. In March 2019, the Shenzhen Metro launched a payment system that makes use of facial recognition technology. The system was launched at the Futian station and involved a pilot 5G network. Instead of using a ticket or scanning a QR code on a smartphone, passengers only have to scan their face at a small display located near the station entrance. The system recognizes the person and automatically deducts the fare from their account. Passengers can then check the information about themselves that includes their photographs, sex, age, and time spent on the subway.
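The fare-payment flow just described boils down to mapping a recognized face to an enrolled account and debiting it. Here is a deliberately minimal sketch of that idea; it is not Shenzhen Metro's actual system, and the account names, balances, and fare are invented.

```python
# Hypothetical enrollment data: a recognized-face identifier -> account holder.
accounts = {"alice": 20.0, "bob": 5.0}
enrolled_faces = {"face_id_1": "alice", "face_id_2": "bob"}

FARE = 2.5  # invented flat fare

def pay_by_face(face_id):
    """Map a recognized face to an account and deduct the fare."""
    user = enrolled_faces.get(face_id)
    if user is None:
        return "unrecognized: please use a ticket or QR code"
    if accounts[user] < FARE:
        return f"{user}: insufficient balance"
    accounts[user] -= FARE
    return f"{user}: fare {FARE} deducted, balance {accounts[user]:.2f}"

print(pay_by_face("face_id_1"))  # alice's balance drops by one fare
```

Note the graceful fallback: when the face is not recognized, the passenger is directed to the ordinary ticket or QR-code path rather than being blocked.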
And in Australia, facial recognition has even found its way into education. On May 20, 2019, NEC announced the launch of a facial recognition system at educational establishments that aims to prevent cheating at exams. The solution was introduced at training centers for medical specialists who want to work at English-speaking hospitals. The company's representatives note that sometimes more experienced people sit exams in place of the students who are supposed to take them. The authorities sought to solve this problem with facial recognition software that identifies examinees not just before the exam but during it as well, reducing the possibility of cheating to a minimum.
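Verifying identity during the exam, not just at the door, amounts to continuous authentication: the recognizer periodically compares the webcam frame to the enrolled face and flags any drop in confidence. A minimal sketch of that auditing step, assuming the recognizer already produces a per-frame match score (the scores and threshold below are invented):

```python
def audit_exam_session(frame_scores, threshold=0.75):
    """Return indices of periodic frames whose face-match score fell below
    the threshold, i.e. moments when the examinee may have been swapped."""
    return [i for i, score in enumerate(frame_scores) if score < threshold]

# Match scores sampled every few minutes during a hypothetical exam session;
# the dip in the middle could mean a different person sat down at the camera.
scores = [0.95, 0.93, 0.40, 0.91]
flagged = audit_exam_session(scores)
if flagged:
    print(f"Review needed: identity confidence dropped at checks {flagged}")
```

In practice such flags would trigger human review rather than automatic disqualification, since lighting changes or a turned head can also depress the score.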
But can it tell how we feel?

According to a recent report by the USA's National Institute of Standards and Technology, the accuracy of facial recognition technologies increased greatly in 2013-2018 compared with 2010-2013. In other words, most algorithms developed in 2018 outperform even the most accurate ones from the end of 2013.
Then again, according to The Verge, AI still has a long way to go when it comes to the more complex task of recognizing emotions. Algorithms that aim to tell how people really feel are being developed by tech giants such as Microsoft, IBM, and Amazon, as well as many other companies. Potential applications range from automated surveillance systems to weeding out uninterested applicants during job interviews.
The multinational market research firm Kantar Millward Brown uses a technology developed by Affectiva to assess people's reactions to TV advertising. The software records videos of people's faces and then analyzes their facial expressions to identify their emotional state. According to Graham Page, Managing Director of Offer and Innovation at Kantar Millward Brown, analysts often rely on polls, but studying facial expressions offers richer detail. He states that this approach makes it possible to tell which parts of an ad really resonate with the audience.
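Systems of this kind typically reduce each video frame to facial "action unit" intensities (brow raise, lip-corner pull, and so on) and score emotions as weighted combinations of them. The sketch below illustrates that general approach only; it is not Affectiva's actual model, and the action-unit names, weights, and frames are invented.

```python
# Invented per-emotion weights over facial action-unit intensities.
EMOTION_WEIGHTS = {
    "joy":      {"lip_corner_pull": 0.7, "cheek_raise": 0.3},
    "surprise": {"brow_raise": 0.6, "jaw_drop": 0.4},
}

def score_frame(action_units):
    """Score each emotion for one frame of detected action-unit intensities."""
    return {
        emotion: sum(w * action_units.get(au, 0.0) for au, w in weights.items())
        for emotion, weights in EMOTION_WEIGHTS.items()
    }

def segment_reaction(frames):
    """Average the per-frame emotion scores over one ad segment."""
    totals = {emotion: 0.0 for emotion in EMOTION_WEIGHTS}
    for frame in frames:
        for emotion, score in score_frame(frame).items():
            totals[emotion] += score
    return {emotion: total / len(frames) for emotion, total in totals.items()}

# Two frames of a viewer smiling while watching a hypothetical ad segment.
frames = [
    {"lip_corner_pull": 0.9, "cheek_raise": 0.8},
    {"lip_corner_pull": 0.7, "cheek_raise": 0.6},
]
print(segment_reaction(frames))  # "joy" dominates for this segment
```

The weakness the psychologists describe below is baked into this very structure: the fixed mapping from facial movements to emotion labels is exactly the assumption they dispute.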
Some startups also propose using “emotional recognition” in the field of security. For example, representatives of the British company WeSee state that AI can detect suspicious behavior by tracking patterns that are unnoticeable with the naked eye.
Nevertheless, psychologists note that the issue is far more complex than it seems, and there is no solid scientific evidence that software can identify a person's emotional state from their face. A recent study conducted for the American Association for Psychological Science states that emotions can be expressed in a myriad of different ways, which makes producing a credible assessment of a person's state from facial movements alone a much more difficult task.
Among other things, the report’s authors point out that research showing a strong correlation between particular facial expressions and emotional states is often methodologically flawed. For one, some of it involves actors who deliberately perform particular states as reference examples. Also, when subjects are asked to label particular emotions, they are often given a limited list of options to pick from, which nudges them toward a consensus.
“People intuitively understand that emotions are more complex,” comments Lisa Feldman Barrett, Professor of Psychology at Northeastern University (USA). “Sometimes you scream in anger, sometimes you cry, sometimes you laugh, and sometimes you stay silent and plan the demise of your foes. Just think about it: when was the last time someone got an Oscar for frowning in anger? No one would consider that great acting.”
She adds that such nuances are rarely acknowledged by companies that sell emotion-analysis solutions. The researchers therefore conclude that, in a sense, companies that use AI to assess human emotions are misleading their customers. Still, Prof. Barrett does not rule out that precise emotion recognition may become possible in the future, though it would have to rely on much more complex metrics.