Amazon this week said that its Rekognition facial recognition software can now detect a person's fear, according to CNBC.
As one of several Amazon Web Services (AWS) cloud offerings, Rekognition can be used for facial analysis or sentiment analysis, identifying different expressions and predicting emotions from images of people's faces. The system uses AI to 'learn' as it compiles data.
The tech giant revealed updates to the controversial tool on Monday, including improved accuracy and functionality of its face analysis features, such as identifying gender, emotions and age range.
“With this release, we have further improved the accuracy of gender identification,” Amazon said in a blog post. “In addition, we have improved accuracy for emotion detection (for all 7 emotions: ‘Happy’, ‘Sad’, ‘Angry’, ‘Surprised’, ‘Disgusted’, ‘Calm’ and ‘Confused’) and added a new emotion: ‘Fear.’” -CNBC
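For developers, these predictions surface through Rekognition's DetectFaces API, which returns, for each detected face, a list of emotion labels with confidence scores. Below is a minimal sketch of reading that list, assuming the documented response shape; the real call would go through `boto3.client("rekognition").detect_faces(...)`, which is not made here, and the sample confidence values and the `top_emotion` helper are illustrative only.

```python
# Hypothetical sketch: extract the highest-confidence emotion from one
# face entry in a Rekognition DetectFaces response. The real request
# (requires AWS credentials, not made here) would look like:
#   boto3.client("rekognition").detect_faces(
#       Image={"Bytes": image_bytes}, Attributes=["ALL"])

def top_emotion(face_detail: dict) -> tuple[str, float]:
    """Return the highest-confidence emotion label for one detected face."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Trimmed example of one entry from the API's FaceDetails list,
# including the newly added FEAR label; values are made up.
sample_face = {
    "Emotions": [
        {"Type": "CALM", "Confidence": 12.3},
        {"Type": "FEAR", "Confidence": 81.7},
        {"Type": "CONFUSED", "Confidence": 6.0},
    ]
}

label, confidence = top_emotion(sample_face)
print(label, confidence)  # FEAR 81.7
```

Note that the API reports a confidence score per label rather than a single verdict, so it is up to the caller to decide what threshold, if any, counts as "detecting" fear.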
The robots will now be able to prioritize who they kill first based on our emotional reaction to them. I can't possibly see a way this ends badly. https://t.co/KKV6B0APUR — Lord Single Malt (@Singlemaltfiend) August 14, 2019
AI researchers at Microsoft, Kairos, Affectiva and others have spent considerable time and resources trying to read a person's emotions based on their facial expressions, movements, voice and other factors.
That said, some experts have noted that people react and communicate differently depending on culture and situation, meaning that similar facial expressions and movements can convey more than one category of emotion. As such, researchers have warned "it is not possible to confidently infer happiness from a smile, anger from a scowl, or sadness from a frown, as much of current technology tries to do when applying what are mistakenly believed to be scientific facts," according to the report.
CNBC notes that Rekognition has faced criticism over its use by law enforcement agencies, a reported pitch to Immigration and Customs Enforcement, and its adoption by organizations that work with law enforcement.