Amazon is banning police use of Rekognition, its image analysis software, for one year. The move is widely seen as a response to the nationwide protests over the killing of George Floyd and police brutality.
Rekognition was introduced in 2016 and described as a “service that makes it easy to add image analysis to your applications,” one that can “detect objects, scenes, and faces in images.” It is far more than a facial recognition tool and can be used by ad agencies and marketing companies for a variety of applications.
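To give a sense of what that image analysis looks like in practice, here is a minimal sketch using the boto3 SDK; the bucket and file names are hypothetical placeholders, not anything taken from Amazon's documentation or this article.

```python
# Minimal sketch of Rekognition's image analysis APIs via boto3.
# The bucket and object names here are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")
image = {"S3Object": {"Bucket": "example-media-bucket", "Name": "photos/street-scene.jpg"}}

# Detect objects and scenes ("labels") in the image.
labels = rekognition.detect_labels(Image=image, MaxLabels=10, MinConfidence=80)
for label in labels["Labels"]:
    print(f'{label["Name"]}: {label["Confidence"]:.1f}%')

# Detect faces and their basic attributes in the same image.
faces = rekognition.detect_faces(Image=image, Attributes=["DEFAULT"])
print(f'Found {len(faces["FaceDetails"])} face(s)')
```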
The controversy over its use stems from suspected bias, with the software reportedly failing to correctly identify some black suspects.
Rekognition was a controversial topic for Amazon even before the latest spate of nationwide protests against police violence. The technology has been used by law enforcement agencies and was reportedly pitched to Immigration and Customs Enforcement in the U.S. Amazon Web Services has said in the past that the technology is used by organizations that work with law enforcement to advocate for victims of crime.
The Washington County Sheriff's Office in Oregon is the only police department that AWS names on its website as a Rekognition customer. Amazon declined to comment on the total number of police departments that use Rekognition.
Some studies, notably one from MIT, appear to show racial and gender bias in the technology's design: it misidentifies darker-skinned subjects at a significant rate, producing both false positives and false negatives.
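Part of what drives those error rates is that the software only returns a similarity score and leaves the match threshold up to the caller. The sketch below (a hypothetical example using boto3, not code from any of the studies) shows that trade-off: a permissive threshold admits more false positives, a strict one risks more false negatives.

```python
# Sketch of face comparison with an explicit similarity threshold (boto3).
# File names are hypothetical; the threshold choice drives the
# false-positive / false-negative trade-off discussed above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

with open("probe.jpg", "rb") as probe, open("candidate.jpg", "rb") as candidate:
    response = rekognition.compare_faces(
        SourceImage={"Bytes": probe.read()},
        TargetImage={"Bytes": candidate.read()},
        SimilarityThreshold=99,  # a looser threshold (e.g. 80) yields more false matches
    )

for match in response["FaceMatches"]:
    print(f'Possible match, similarity {match["Similarity"]:.1f}%')
if not response["FaceMatches"]:
    print("No face met the threshold")
```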
Research has indicated that facial recognition software may hold racial and gender bias. Last year Joy Buolamwini, founder of the Algorithmic Justice League, testified about research she conducted on the subject at the Massachusetts Institute of Technology. Her findings helped provide the basis for a shareholder vote Amazon held last year, in which 2.4% of shareholders voted in favor of banning the sale of the technology to government agencies.
That problem may be a matter of tweaking the software, but privacy advocates are concerned with other, more troubling aspects of the technology.
“People should be free to walk down the street without being watched by the government,” the ACLU said in 2018, two years after the product hit the market. “By automating mass surveillance, facial recognition systems like Rekognition threaten this freedom, posing a particular threat to communities already unjustly targeted in the current political climate.”
Amazon developed the technology because of its value to many types of businesses.
In the private sector, the service has been perhaps most enthusiastically adopted by companies in the marketing and advertising space, particularly those that have huge catalogs of photos and videos that they want to categorize and tag for easy searching.
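That cataloging workflow is essentially a loop over a media library: label each image, then index it by the tags returned. Below is a rough sketch of that pattern with boto3; the bucket name and prefix are hypothetical, and a real deployment would store the index somewhere more durable than memory.

```python
# Sketch of the tagging workflow described above: label every photo in a
# catalog and build a simple label -> photos index for searching.
# The bucket name and prefix are hypothetical.
from collections import defaultdict
import boto3

s3 = boto3.client("s3")
rekognition = boto3.client("rekognition")
bucket, prefix = "example-ad-agency-assets", "campaign-photos/"

index = defaultdict(list)  # e.g. "Beach" -> ["campaign-photos/img001.jpg", ...]
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        labels = rekognition.detect_labels(
            Image={"S3Object": {"Bucket": bucket, "Name": obj["Key"]}},
            MaxLabels=15,
            MinConfidence=85,
        )
        for label in labels["Labels"]:
            index[label["Name"]].append(obj["Key"])

print(index.get("Beach", []))  # every catalog photo tagged "Beach"
```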
According to case studies provided by Amazon, social media companies use the program to weed out “fake followers” and find “micro-influencers” who have strong followings on social networks and could be used to promote brands. The White House has used Rekognition in an application that lets users take a photo and find which first lady they most closely resemble.
Clearly there is great value associated with this software, so Amazon is not going to stop selling it entirely, and the one-year ban extends only to law enforcement. Congress has been holding hearings on facial recognition software of all kinds, hoping to establish guidelines for its ethical use, and the protests should make it easier to pass a bill addressing facial recognition.