Is Amazon’s Face-Detection Service Racially Biased? This Study Says Yes

Videos on YouTube revealed Amazon Rekognition even misclassified famous African-American women, like Michelle Obama, as men.

The service labeled darker-skinned women as men 31 percent of the time.

Amazon’s facial-detection technology, Amazon Rekognition, has been misidentifying women, especially those with darker skin, according to a team of researchers at MIT and the University of Toronto.

Over the last two years, the service has been marketed to law enforcement as a way to identify objects, people, text, scenes and activities, as well as to detect inappropriate content, according to Amazon.

The company said the facial analysis would be able to determine things like age range, facial hair, emotions and more.

Researchers found that the technology had trouble determining the gender of female faces and darker-skinned faces in photos, reports The New York Times. The service labeled darker-skinned women as men 31 percent of the time. Lighter-skinned women were misidentified seven percent of the time.

When it came to lighter-skinned men, however, the service made zero errors.

Because of this racial bias and discrimination against minorities, privacy and civil rights advocates, like the ACLU, have demanded that Amazon stop marketing Amazon Rekognition and selling it to police. Investors have also asked the company to halt marketing to avoid potential lawsuits.

Using artificial intelligence (AI) for security and surveillance has its advantages, like identifying faces in crowds or age detection. These advantages could be particularly useful in helping law enforcement catch criminals or find missing children.

The latest debate, however, is whether this crosses the line when it comes to a person’s privacy, and whether and how Congress should regulate such powerful technologies.

The ACLU says the technology “is primed for abuse in the hands of governments, poses a grave threat to communities already unjustly targeted in the current political climate, and undermines public trust in Amazon.”

The new study, which will be presented at an artificial intelligence and ethics conference, warns of potential abuse and threats to privacy from the facial-detection technology as well.

Matt Wood, the general manager of AI with Amazon, says the study only focused on facial analysis, a technology that can spot features such as mustaches or expressions such as smiles, and not facial recognition, a technology that can match faces in photos or video stills to identify individuals. Wood says Amazon markets both services.

“It’s not possible to draw a conclusion on the accuracy of facial recognition for any use case — including law enforcement — based on results obtained using facial analysis,” Dr. Wood said in a statement. He added researchers did not test the latest version of Rekognition.

According to the study, Microsoft’s facial recognition technology misclassified darker-skinned women as men one in five times, but Amazon seems to be catching most of the criticism.

One reason could be that Amazon has been less willing than other tech companies, like Microsoft and IBM, to discuss the concerns voiced about its products.

California Representative Jimmy Gomez has been investigating Amazon’s facial recognition practices and believes the company needs to publicly address these issues.

“I also want to know if law enforcement is using it in ways that violate civil liberties, and what — if any — protections Amazon has built into the technology to protect the rights of our constituents,” he said.

Amazon responded to Gomez by saying all Rekognition customers must follow the company’s policies on civil rights and other laws. However, Amazon does not audit its customers, making it difficult to know for sure how the product is being used.

Wood confirmed that Amazon has updated its technology since the study, re-tested it and found “zero false-positive matches,” according to ABC News.

The website also credits Rekognition with helping the Washington County Sheriff’s Office speed up the process of identifying suspects from thousands of photo records, and ultimately, catching a criminal.

About the Author

Katie Malafronte
Katie Malafronte is Campus Safety's Web Editor. She graduated from the University of Rhode Island in 2017 with a Bachelor's Degree in Communication Studies and a minor in Writing & Rhetoric. Katie has been CS's Web Editor since 2018.
