(Bloomberg) -- Amazon.com Inc. drew the ire of the American Civil Liberties Union on Tuesday over a facial-recognition system offered to law-enforcement agencies that the advocacy group says can be used to violate civil rights.
In marketing materials obtained by the group, Amazon Web Services said its Rekognition system uses artificial intelligence to quickly identify people in photos and videos, enabling law enforcement to track individuals.
"Amazon’s Rekognition raises profound civil liberties and civil rights concerns," the group said in a statement. "Today, the ACLU and a coalition of civil rights organizations demanded that Amazon stop allowing governments to use Rekognition."
Law enforcement agencies in Florida and Oregon are using the service for surveillance, according to the ACLU. The group used public-records requests to learn about the service.
Government use of facial-recognition software has raised concerns among civil rights groups that maintain it can be used to quell dissent and target groups such as undocumented immigrants and black rights activists.
The Washington County Sheriff’s Department just outside Portland, Oregon, meanwhile, thinks the software is a great deal. The department pays Amazon between $6 and $12 a month to quickly cross-reference security camera footage and other images with 300,000 jail-booking photos to help identify suspects in criminal investigations, Deputy Jeff Talbot said. The software isn’t used for real-time or mass surveillance, he said.
"We absolutely find $6 to $12 to be a smart investment in our community’s safety," Talbot said. "The response from our constituents has been positive. They’re glad we’re staying on the cutting edge of technology."
Some AI software that’s used for facial recognition has been shown to be racially biased because it was trained using images with relatively few minorities included. In an infamous example from 2015, Google’s AI-powered photo-tagging system classified some black people as gorillas. In a paper published this year, researchers at MIT and Microsoft Corp. found that facial-recognition systems are far less accurate at identifying non-white people and women than white men.
Amazon maintains law enforcement is just one application of the technology, which can also be used to find abducted people, and by amusement parks to track down lost children. The internet giant also noted that the recent British royal wedding used Rekognition to identify attendees.
"When we find that AWS services are being abused by a customer, we suspend that customer’s right to use our services," Amazon said in an emailed statement. "We require our customers to comply with the law and be responsible when using Amazon Rekognition."
Oregon’s Washington County sheriff’s office wants to use the system to scan some 300,000 booking photos from its jail that it has compiled since 2001, according to records obtained by the ACLU.
A marketing presentation by Amazon's cloud-computing business indicated the Rekognition system can slash the time it takes to identify individuals in photos and video surveillance. The company's technology does the job in minutes, versus the days required when images are sent to different law-enforcement agencies for manual review, according to the marketing documents obtained by the ACLU.
In one email exchange last year, an Oregon law enforcement officer asked if the product could be enhanced to automatically tag inmate booking photos with descriptions of their tattoos. The system already tags a photo of someone with "tattoo," but the officer wanted to know whether it could go further and describe the tattoo as "dragon" or "flower" to make the tags more precise.
Law enforcement has made wide use of facial recognition for a range of tasks, from running mug shots against databases of driver's-license photos to scanning people walking past surveillance cameras. In 2016, researchers at Georgetown University found that at least five major police departments claimed to run real-time face recognition on footage from street cameras, or expressed interest in doing so. Almost without exception, the agencies buying facial-recognition technology from private companies didn't require the vendors to show evidence that it was accurate.
Earlier this year, police in South Wales released statistics on their use of facial recognition over a 10-month period to match people at public events against law enforcement watch lists. Of 15 events, there were only five where positive identifications outnumbered false positives -- incidents where someone was mistakenly flagged as suspicious. At one soccer match, the system produced nearly 2,300 false positives, compared with 173 actual matches. "The past 10 months have been a resounding success in terms of validating the technology," the force wrote in its report, noting that it hadn't actually arrested the people who were falsely identified.
Axon Enterprise Inc., the dominant provider of body cameras to U.S. law enforcement agencies, has said one of its primary businesses in the future will be to aggregate and analyze video its devices capture. Acknowledging the potential for abuse, the company this year established an AI ethics board.
A coalition of civil rights groups wrote an open letter to Axon in April, urging the board to disavow certain products that would be "categorically unethical to deploy." The top technology the groups mentioned was real-time facial recognition of video from body cameras. "Axon is not actively working on facial recognition at this time," Chief Executive Officer Rick Smith said in a statement.
Microsoft and Google have similar facial-recognition technology. The ACLU focused Tuesday’s report on Amazon because the documents are "incredibly alarming and raise serious questions about the plans that other companies have for their own artificial intelligence systems as well," said Nicole Ozer, technology and civil liberties director for the ACLU of California. "It is difficult to overstate Amazon’s influence on whether the threats posed by facial recognition tools are unleashed in American communities."
©2018 Bloomberg L.P.