
Amazon facial recognition technology shows gender, racial bias, researchers say



Amazon's facial recognition technology for law enforcement often misidentifies women, especially those with darker skin, according to researchers from MIT and the University of Toronto. Privacy and civil rights advocates have called on Amazon to stop selling its Rekognition service because of concerns about discrimination against minorities.

Some Amazon investors have also asked the company to stop selling it, fearing that the technology leaves Amazon vulnerable to lawsuits.

The researchers say that in their tests, Amazon's technology labeled darker-skinned women as men 31% of the time. Lighter-skinned women were misidentified 7% of the time. Darker-skinned men had a 1% error rate, while lighter-skinned men had none.

Artificial intelligence can mimic the biases of its human creators as it makes its way into everyday life. The new study, released late Thursday, warns of the potential for abuse and threats to privacy and civil liberties posed by facial recognition technology.

Photo: Demonstrators hold images of Amazon CEO Jeff Bezos near their faces during a Halloween-themed protest at Amazon headquarters over the company's Rekognition facial recognition system, Oct. 31, 2018, in Seattle. (Elaine Thompson / AP)


Matt Wood, general manager of artificial intelligence with Amazon's cloud computing unit, said the study uses "facial analysis," not facial recognition technology. Wood said facial analysis "can spot faces in videos or images and assign generic attributes such as wearing glasses; recognition is a different technique by which an individual face is matched to faces in videos and images."
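To illustrate the distinction Wood is drawing, here is a minimal sketch using the AWS SDK for Python (boto3). The image file names, AWS region, and similarity threshold are illustrative assumptions, not details taken from the study or from Amazon's statement.

import boto3

# Rekognition client; the region is an arbitrary choice for this sketch.
client = boto3.client("rekognition", region_name="us-west-2")

def load_bytes(path):
    # Read an image file into raw bytes for the Rekognition API.
    with open(path, "rb") as f:
        return f.read()

# Facial ANALYSIS: detect faces in one image and return generic attributes
# (estimated gender, glasses, emotions, and so on).
analysis = client.detect_faces(
    Image={"Bytes": load_bytes("crowd.jpg")},  # hypothetical file name
    Attributes=["ALL"],
)
for face in analysis["FaceDetails"]:
    print(face["Gender"], face["Eyeglasses"])

# Facial RECOGNITION: match one person's face against the faces found in
# another image and report the similarity of each match.
matches = client.compare_faces(
    SourceImage={"Bytes": load_bytes("person.jpg")},  # hypothetical file name
    TargetImage={"Bytes": load_bytes("crowd.jpg")},
    SimilarityThreshold=80,
)
for match in matches["FaceMatches"]:
    print(match["Similarity"])

As Wood characterizes it, the gender-classification errors measured in the study concern the analysis path (attribute estimates like those returned by detect_faces), while the suspect-identification use described later in this article corresponds to the matching step (compare_faces).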

In a Friday post on the Medium website, MIT Media Lab researcher Joy Buolamwini responded that companies should check all of their systems that analyze human faces for bias.

"If you sell a system that has been shown to have bias in terms of human faces, it is doubtful that your other person-based products are also completely impartial," she writes.

Amazon's reaction shows that the company is not treating the grave concerns raised by this study seriously, said Jacob Snow, an attorney with the American Civil Liberties Union.

Buolamwini and Inioluwa Deborah Raji of the University of Toronto said they studied Amazon's technology because the company has sold it to law enforcement. Raji's LinkedIn account says she currently works as a researcher in artificial intelligence at Google, which competes with Amazon in providing cloud computing services.

Buolamwini and Raji say that Microsoft and IBM have improved their facial analysis software since the researchers found similar problems in a May 2017 study. Their second study, which included Amazon, was done in August 2018. Their paper is being presented Monday at an artificial intelligence conference in Honolulu.

Wood said that Amazon has updated its technology since the study and that its own analysis found "zero false positive matches."

Amazon's website credits Rekognition with helping the Washington County Sheriff's Office in Oregon speed up the identification of suspects from hundreds of thousands of photos.

