Protesters in New York criticizing Amazon’s ties to ICE. Photo: Kevin Hagen (Getty)
Most people familiar with face-scanning software suite Amazon Rekognition, its place in the surveillance state, and its questionable efficacy are apt to fear what the consequences of such a technology might be when deployed against civilians. As the company was delighted to point out, fear is also the most recent emotion Rekognition is now able to detect.
In an otherwise short and unremarkable press release posted on the company’s Amazon Web Services site yesterday regarding “accuracy and functionality improvements” to Rekognition, Amazon wrote (emphasis ours):
> Face analysis generates metadata about detected faces in the form of gender, age range, emotions, attributes such as ‘Smile’, face pose, face image quality and face landmarks. With this release, we have further improved the accuracy of gender identification. In addition, we have improved accuracy for emotion detection (for all 7 emotions: ‘Happy’, ‘Sad’, ‘Angry’, ‘Surprised’, ‘Disgusted’, ‘Calm’ and ‘Confused’) and **added a new emotion: ‘Fear’**. Lastly, we have improved age range estimation accuracy; you also get narrower age ranges across most age groups.
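In practice, that metadata comes back from Rekognition's `DetectFaces` API (via boto3, `rekognition.detect_faces(Image=..., Attributes=["ALL"])`) as a per-face JSON structure. The sketch below parses a hand-written response in the documented shape to show where the emotion labels live; the confidence values are invented, not real API output:

```python
# Sketch of reading the face-analysis metadata described in the release.
# A real call would be boto3's rekognition.detect_faces(Image=..., Attributes=["ALL"]);
# here we walk a hand-written response in the documented shape, since no AWS
# credentials or real image are involved. All numbers below are made up.

sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 25, "High": 35},
            "Gender": {"Value": "Female", "Confidence": 99.1},
            "Smile": {"Value": False, "Confidence": 97.4},
            "Emotions": [
                {"Type": "FEAR", "Confidence": 81.2},
                {"Type": "CALM", "Confidence": 10.5},
                {"Type": "CONFUSED", "Confidence": 4.1},
            ],
        }
    ]
}

def top_emotion(face: dict) -> str:
    """Return the highest-confidence emotion label for one detected face."""
    return max(face["Emotions"], key=lambda e: e["Confidence"])["Type"]

for face in sample_response["FaceDetails"]:
    print(top_emotion(face))  # the newly added label surfaces as "FEAR"
```

Each detected face carries the full list of emotion labels with confidence scores, so "detecting fear" just means `FEAR` now appears in that list alongside the original seven.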
What about Rekognition might have you afraid, possibly in a way that’s visually identifiable to a piece of software which can then compartmentalize, categorize, and improve itself to better detect similar fear states in the grim days to come?
- Rekognition misidentified 28 members of Congress as criminals
- It also appears to disproportionately misidentify women and people of color
- Despite that, it’s already been put into use by police
- Amazon does not and cannot ensure police are using the tools according to best practices
- Rekognition is so invasive even some police departments worried using it would present “a Big Brother vibe”
- Amazon has so far refused to confirm or deny whether Rekognition is in use by U.S. Immigration and Customs Enforcement—the Homeland Security arm most closely associated with the ongoing policy of placing undocumented people in concentration camps—but it has verifiably pitched the agency on using this software
- Smart doorbell product Ring, which Amazon owns, filed patent applications to incorporate facial recognition into those devices, which became public last December
Are you worried yet? Please muster your best approximation of terror, as the quality of the training data is really what makes or breaks the whole thing, guys.
When asked what possible use case could justify having Rekognition verify human fear, Amazon spokesperson Jesse Freund responded that criminal investigations, human trafficking, and missing children cases could all be potential applications. When I pressed for more, Freund added to the list improved physical security, moderating offensive online imagery, and limiting the human bias inherent in policing (lol???). Inexplicably, Freund closed by suggesting that it could be fun to use Rekognition on people at amusement parks.
Whatever that means, the desired effect has been achieved: I’m afraid now.