Updated on June 12 at 12:55 p.m. ET
Amazon announced on Wednesday a one-year moratorium on police use of its facial-recognition technology, yielding to pressure from police-reform advocates and civil rights groups.
It is unclear how many law enforcement agencies in the U.S. deploy Amazon's artificial intelligence tool, but an official with the Washington County Sheriff's Office in Oregon confirmed that it will be suspending its use of Amazon's facial recognition technology.
Researchers have long criticized the technology for producing inaccurate results for people with darker skin. Studies have also shown that the technology can be biased against women and younger people.
IBM said earlier this week that it would quit the facial-recognition business altogether. In a letter to Congress, chief executive Arvind Krishna condemned software that is used "for mass surveillance, racial profiling, violations of basic human rights and freedoms."
And Microsoft President Brad Smith told The Washington Post during a livestream Thursday morning that his company has not been selling its technology to law enforcement. Smith said he has no plans to do so until there is a national law governing the technology.
Congressional Democrats are seeking to regulate the technology in sweeping police reform legislation inspired by the nationwide protests over the killing of George Floyd, an unarmed black man who died in Minneapolis after a white police officer knelt on his neck for nearly nine minutes.
The proposed bill would limit how much federal law enforcement officials could use facial recognition technology, including a ban on using the software with police body-worn cameras.
In its statement, Amazon said it supports federal regulation of its algorithm-driven facial recognition software, known as Rekognition.
"We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested," the company said in a statement.
Amazon noted that authorities will still be able to use the facial recognition technology to help rescue human trafficking victims and reunite missing children with their families.
Rekognition is part of Amazon Web Services, the tech giant's cloud computing division. It can use machine learning to rapidly compare an image captured from a person's social media account or from an officer's smartphone to look for a match from a database of hundreds of thousands of mugshots. Critics have been wary that using an algorithm to confirm who someone is can lead to cases of mistaken identity.
Nicole Ozer, technology and civil liberties director with the ACLU of Northern California, said a blanket ban on the technology is needed, but she welcomed Amazon's one-year pause, saying it shows that the company is "finally recognizing the dangers face recognition poses to black and brown communities and civil rights more broadly."
Ozer added: "Face recognition technology gives governments the unprecedented power to spy on us wherever we go. It fuels police abuse. This surveillance technology must be stopped."
Fight for the Future, a digital rights advocacy group, is also calling for an outright national ban on facial recognition technology and says Amazon's one-year break appears strategic.
"They've been calling for the federal government to 'regulate' facial recognition, because they want their corporate lawyers to help write the legislation, to ensure that it's friendly to their surveillance capitalist business model," the group said. "The reality is that facial recognition technology is too dangerous to be used at all."
American intelligence and military officials have long used facial recognition software in overseas anti-terrorist operations, but local and federal law enforcement agencies inside the U.S. have increasingly turned to the software as a crime-fighting tool. Immigration and Customs Enforcement has used the technology to scan millions of driver's licenses for possible matches.
One study by the Massachusetts Institute of Technology demonstrated that while men with lighter skin were almost always correctly identified, about 7% of women with lighter skin were misidentified, and up to 35% of women with darker skin were falsely identified.
"With IBM's decision and Amazon's recent announcement, the efforts of so many civil liberties organizations, activists, shareholders, employees and researchers to end harmful use of facial recognition are gaining even more momentum," said Joy Buolamwini, who led the MIT study and founded the Algorithmic Justice League, which is calling for a nationwide moratorium on all government use of facial recognition technologies. "The first step is to press pause."
NOEL KING, HOST:
Amazon says it will temporarily stop giving law enforcement agencies access to its facial recognition software. Earlier this week, IBM said it would stop its work on this type of technology. NPR's Bobby Allyn has been reporting on all of this. And before we get started, I will note that NPR does get financial support from Amazon.
Bobby, good morning.
BOBBY ALLYN, BYLINE: Good morning.
KING: So Amazon is the place that a lot of us get a lot of our stuff, but it also has a more low-key, though thriving, business providing facial recognition software to law enforcement. Tell us about that business.
ALLYN: Sure. It's called Rekognition. Amazon spells that with a K. And in recent years, they have been selling it to police departments, and police departments have been using it to identify potential suspects by checking photos against a giant database of mug shots that are stored digitally in the cloud.
But then a bunch of studies took a really hard look at this tool and found a lot of problems. So this facial recognition tool has a hard time correctly identifying people with darker skin, harder time identifying women and younger people. One audit even found when members of Congress were run through this Amazon tool that nearly two dozen of them were matched to people who had been arrested for committing crimes. So those were false matches, yeah.
So I talked to an MIT researcher. Her name is Joy Buolamwini. And she has been studying Amazon's technology, and here's her assessment.
JOY BUOLAMWINI: Racial biases, gender bias, even has age bias - but even if that bias wasn't there, there's still the capacity for abuse. And most concerningly, especially as we're seeing more people take to the streets, is the specter of using facial recognition for surveillance.
ALLYN: So as Buolamwini says, the protests across the country have created some intense concern about the algorithms here. And you know, that fear about mass surveillance this week drove IBM to abandon its facial recognition line of business completely.
KING: OK, I do understand the concern about mass surveillance. How are police departments actually using it?
ALLYN: Yeah, so this is how it works. So say you get pulled over, and a deputy comes up to you. The deputy can take your photo with his phone and upload that into this system. And the system will scan hundreds of thousands of mug shots and see if you have been arrested for a crime in the past several years. They can even do this with someone's social media photo. They can do it with sort of a grainy security camera picture. And police say this is really helpful when a suspect, you know, refuses to give their name or when they're, you know, looking at someone at a crime scene who is unconscious.
But you know, the MIT researcher I talked to, Buolamwini, she's worried about, you know, use of this technology outside of government, about people who could put it to other, more troubling uses.
KING: You know, Bobby, I wonder - did Amazon come right out and cop to doing this, to making this move, because of the protests over George Floyd's killing and over racial injustice in this country?
ALLYN: You know, Noel, that's certainly the backdrop here. But in Amazon's really brief announcement, they made no mention of George Floyd's name. They did not say anything about the unrest we're seeing. Instead, they said they're halting the facial recognition software in order to allow Congress to develop some regulations. And you know, this creep into what some people call policing by algorithm has caused pushback from even inside of Amazon. Some of Amazon's own investors and employees have real fears about this and have made them known.
KING: But what does it mean for law enforcement agencies that were already using it?
ALLYN: Yeah. So Amazon has never revealed a complete list of all of the law enforcement agencies that use this technology, but I did talk to the sheriff's office in Washington County, Ore. It's the third-largest law enforcement agency in the state. They were using it - using it quite a bit. And they said, since we heard that Amazon is forcing us to stop, we're going to stop. So they're not going to be using it for at least the next year.
KING: OK. But one thing we all know about businesses and competition is that somebody's going to step in and fill the void, right? There's got to be an Amazon competitor out there like, we got it.
ALLYN: Yeah, right. So Microsoft also has similar technology. There's other IT and surveillance companies that offer a similar service. So just because Amazon is halting doesn't mean it's going to go away.
KING: NPR's Bobby Allyn, such interesting reporting. Thanks, Bobby.
ALLYN: Thank you, Noel. Transcript provided by NPR, Copyright NPR.