Amazon Rekognition Matched Lawmakers to Mugshots, Congress Demands Answers


Amazon Rekognition, a facial recognition tool, is often touted as simple and easy to use, and is marketed to U.S. law enforcement agencies as an effective identification tool. However, a study by the American Civil Liberties Union found that Amazon’s facial recognition software does not perform as well as advertised.

Congress members unhappy

On Thursday, the ACLU released a study claiming that Amazon’s facial recognition tool falsely matched photos of 28 members of Congress to mugshots. For the study, the ACLU used Rekognition to scan the faces of all 535 members of Congress against 25,000 publicly available mugshots.
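The ACLU has not published its exact code, but a test of this kind maps naturally onto Rekognition’s collection-based API: index a set of mugshots into a collection, then search each lawmaker’s photo against it. Below is a minimal sketch using the boto3 SDK; the collection name, S3 bucket, and file names are illustrative placeholders, not details from the ACLU’s study.

```python
import boto3

# Illustrative sketch only -- collection, bucket, and file names are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# 1. Create a collection and index the mugshot photos into it.
rekognition.create_collection(CollectionId="mugshot-test")
for key in ["mugshots/0001.jpg", "mugshots/0002.jpg"]:  # the ACLU's test used 25,000 images
    rekognition.index_faces(
        CollectionId="mugshot-test",
        Image={"S3Object": {"Bucket": "example-bucket", "Name": key}},
        ExternalImageId=key.replace("/", "-").replace(".jpg", ""),
    )

# 2. Search a lawmaker's official photo against the indexed mugshots.
response = rekognition.search_faces_by_image(
    CollectionId="mugshot-test",
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "members/official-photo.jpg"}},
    MaxFaces=5,
)

# Faces returned above the service's similarity cutoff count as "matches".
for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], match["Similarity"])
```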

“Congress must take these threats seriously, hit the brakes, and enact a moratorium on law enforcement use of face recognition,” the ACLU said.

Following the ACLU’s findings, several Democratic members of Congress penned strongly worded letters to Amazon CEO Jeff Bezos. Civil rights leader Rep. John Lewis was among the public figures wrongly matched in the ACLU’s test.

“The results of the ACLU’s test of Amazon’s ‘Rekognition’ software are deeply troubling,” Lewis said in a letter. “As a society, we need technology to help resolve human problems, not to add to the mountain of injustices presently facing people of color in this country.”

Sen. Ed Markey and Reps. Luis Gutiérrez and Mark DeSaulnier also wrote a letter to Bezos questioning the technology.

Amazon Rekognition – arguments for and against

Amazon, for its part, defended the Rekognition technology and disputed the methods used by the ACLU. In a statement to TechCrunch, the company said, “[W]e think that the results could probably be improved by following best practices around setting the confidence thresholds (this is the percentage likelihood that Rekognition found a match) used in the test.”

Amazon noted that an 80% confidence threshold is acceptable for identifying photos of hot dogs, chairs, or animals, but not for identifying individuals. The company recommends a confidence threshold of 95% or higher when facial recognition is used for law enforcement purposes.
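That recommendation boils down to a single parameter in the Rekognition API: if the caller does not specify a threshold, the service falls back to its default, and the stricter 95% cutoff Amazon describes has to be passed explicitly. A short sketch with boto3, again using placeholder bucket, collection, and file names:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# FaceMatchThreshold defaults to 80 if omitted; Amazon's guidance for
# law enforcement use is to raise it to 95 or higher.
response = rekognition.search_faces_by_image(
    CollectionId="mugshot-test",
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "members/official-photo.jpg"}},
    FaceMatchThreshold=95,
)

# The same idea applies to one-to-one comparisons via SimilarityThreshold.
rekognition.compare_faces(
    SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "photo-a.jpg"}},
    TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "photo-b.jpg"}},
    SimilarityThreshold=95,
)
```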

The ACLU, however, pointed out that its study used Rekognition’s default settings, which Amazon itself had chosen. The company’s own website references the 80% confidence figure for identifying faces. According to ACLU attorney Jacob Snow, Amazon’s response amounts to an admission that its technology does not work well in its default configuration.

“Essentially, they are saying their product is broken out of the box,” Snow said.

Snow also noted that there is no evidence Amazon instructs law enforcement agencies to set a higher threshold, and even if the company does make such recommendations, there is no guarantee that the authorities actually follow them.

Racial bias – a common issue with facial recognition software

The ACLU study also found that the tool exhibits racial bias. According to Snow, about 40% of Rekognition’s false matches were of people of color, even though only 20% of the members of Congress are people of color. The ACLU’s results are in line with NIST’s Facial Recognition Vendor Test, which has consistently shown higher error rates for women and African Americans.

“People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that,” Snow said.

Difficulty in identifying darker skin tones is a common problem with facial recognition technology. Earlier this year, MIT Media Lab’s Joy Buolamwini and Microsoft’s Timnit Gebru found that facial recognition software from IBM, Microsoft, and Face++ had more trouble classifying gender for people of color than for white people. The same researchers also found a similar built-in bias in Amazon Rekognition.

“I join the chorus of dissent in calling Amazon to stop equipping law enforcement with facial analysis technology,” Buolamwini said in a letter to Amazon CEO Jeff Bezos.

Privacy advocates have long argued that law enforcement’s use of facial recognition must be heavily restricted until the technology’s accuracy and racial bias problems are corrected, and that even then its use should be limited and its scope clearly defined.

Despite this, Rekognition is already in active use in Washington County, Oregon, and the Orlando, Florida, police department recently began testing Amazon’s tool internally. Amazon offers the facial recognition tool as part of its Amazon Web Services cloud platform, and the software is inexpensive, often costing about $12 a month for an entire department.
