It knows you’re afraid. One of the world’s largest corporations is pretty pleased with the latest thing its facial recognition system can do.
The company has announced that its system can detect fear. That’s no comfort to those who are afraid of what part this company is playing in helping police build surveillance networks.
We’ll let you know which company has added fear to the facial recognition mix. Also, why some lawmakers and privacy advocates are not thrilled with the company’s system.
Amazon technology now recognizes fear
It almost seems inevitable. Amazon’s face-scanning software suite Amazon Rekognition has been inspiring fear in many people for a few years.
The company has announced that the system can now recognize fear. While touting the system's improved accuracy in recognizing emotions, the company said fear had joined happiness, sadness, anger, surprise, disgust, calm and confusion on its creepy list of detectable emotions.
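For the curious, here is roughly what that emotion output looks like to a developer. This is an illustrative sketch only: the response shape follows the format AWS documents for its DetectFaces API, but the face data and confidence scores below are invented, and an actual call requires AWS credentials and the boto3 library.

```python
# Illustrative sketch: pull the dominant emotion label out of a
# Rekognition-style DetectFaces response. A real call would look roughly like:
#   client = boto3.client("rekognition")
#   response = client.detect_faces(
#       Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
#       Attributes=["ALL"],
#   )
# The sample below mimics that response shape with made-up numbers.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "FEAR", "Confidence": 91.2},
                {"Type": "SURPRISED", "Confidence": 5.4},
                {"Type": "CALM", "Confidence": 1.1},
            ]
        }
    ]
}

def dominant_emotion(face_detail):
    """Return the highest-confidence emotion label for one detected face."""
    return max(face_detail["Emotions"], key=lambda e: e["Confidence"])["Type"]

for face in sample_response["FaceDetails"]:
    print(dominant_emotion(face))  # FEAR
```

In other words, the service doesn't report a single verdict; it hands back a confidence score for every emotion on the list, and it's up to the customer what to do with a face that scores high on fear.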
So why does Rekognition need to recognize fear? An Amazon spokesperson told Gizmodo that it would be useful in cases of human trafficking and missing children, as well as improving physical security and limiting the human bias inherent in policing.
Bizarrely, the spokesperson wrapped up the spiel by saying how fun it would be to use Rekognition on people at amusement parks. OK.
The fear component certainly isn’t going to do anything to endear Rekognition to people already opposed to it.
Related: Your face might be part of facial recognition databases shared worldwide
Amazon Rekognition and police surveillance
One of Rekognition’s fiercest critics is the American Civil Liberties Union (ACLU). The ACLU is concerned because Amazon has sold its facial recognition technology to local law enforcement agencies and the FBI.
The organization has warned that the facial recognition tool is unreliable and biased against people of color.
To prove its point, the ACLU used Rekognition to compare photos of members of Congress against a database of criminal mugshots. The test produced more than two dozen false positives.
Of the 28 false positives, six were members of the Congressional Black Caucus. Even though people of color make up only around 20% of Congress, they accounted for nearly 40% of the false positives.
In January, researchers from the MIT Media Lab published a study finding that Rekognition struggled to correctly identify women of color. Also stoking fears: Amazon will not disclose which local law enforcement agencies have bought Rekognition, or how many.
And let’s not forget Amazon’s Ring Video Doorbell. Ring has been working for years with police in various areas to help them take advantage of footage shot by the video doorbell.
In just four years, the Ring program reportedly has expanded to more than 225 cities across the nation. The program alarms privacy and civil liberties advocates, who fear that police requests for Ring footage will amount to a surveillance network operating without warrants.