Last week, the ACLU of Northern California revealed that Amazon’s Rekognition face surveillance software identified 28 members of Congress as people who have been previously arrested for a crime. The only problem? They were false matches. Unsurprisingly, the 28 members of Congress falsely matched to the mugshot database were disproportionately people of color. While Congress is only 20 percent people of color, they were 40 percent of those falsely matched by Rekognition.
Lawmakers reacted swiftly to the report. Senators Ron Wyden (D-OR), Cory Booker (D-NJ), and Ed Markey (D-MA) fired off a letter to 39 federal law enforcement agencies, among them ICE and the FBI, demanding information about how they use face surveillance technologies. Senator Markey and Representatives Luis Gutiérrez (D-IL) and Mark DeSaulnier (D-CA) also sent a letter to Jeff Bezos himself, asking the billionaire for more details about Rekognition and a list of its government users.
Rekognition’s failure to accurately recognize Congress members comes on the heels of demands from the ACLU, a coalition of 70 civil rights, civil liberties, religious, and community organizations, hundreds of thousands of petition signatories, and Amazon shareholders that Amazon stop selling its face surveillance technology to law enforcement and other government entities. Employees later joined in by publicly asking Bezos to stop selling Rekognition to law enforcement and demanding more transparency about the company’s contracts with police departments and federal agencies like ICE.
Citing concerns that face surveillance technology violates privacy and free speech rights, and will exacerbate racial biases in policing, the ACLU has called for a moratorium on government use.
Despite these demands, the company’s embarrassing public relations disaster last week, and resulting scrutiny from Congress, Amazon has not budged. After the ACLU’s test results went live, Amazon’s head of artificial intelligence Matt Wood blogged about the issue. But instead of admitting error or indicating that the company would stop and reflect on the concerns raised by hundreds of thousands of people nationwide, he quibbled with the ACLU study’s methodology, and didn’t back down from his defense of Amazon’s sales to law enforcement.
But perhaps it isn’t surprising that Amazon took shots at the ACLU’s study instead of addressing community concerns about its sale of face surveillance technology to government actors like police. The company’s core defense of its sales to police has utterly fallen apart.
Back in June, after the coalition letter demanding Amazon stop selling face surveillance to police first went live, Wood wrote a blog post defending the company’s sales to government. His defense rested on the company’s acceptable use policy, which bans the use of Rekognition and other AWS products for illegal activities.
“There has been no reported law enforcement abuse of Amazon Rekognition,” Wood wrote. “We also have an Acceptable Use Policy that prohibits the use of our services for ‘[a]ny activities that are illegal, that violate the rights of others, or that may be harmful to others.’ This includes violating anybody’s Constitutional rights relating to the 4th, 5th, and 14th Amendments – essentially any kind of illegal discrimination or violation of due process or privacy right. Customers in violation of our AUP are prevented from using our services.”
That sounds good in theory, but how will Amazon know if its law enforcement customers are using Rekognition to violate people’s rights or break the law? After all, courts have yet to draw clear boundaries outlining the precise protections the Constitution provides in the era of artificial intelligence. And even if we knew what the Supreme Court would say about these matters, companies like Amazon are not well positioned to play cop on the beat when it comes to policing how their customers use their tools.
How, then, might Amazon enforce its acceptable use policy? On July 24, FedScoop published comments from an Amazon vice president, Teresa Carlson, suggesting that it doesn’t enforce the policy at all.
“We are unwaveringly in support of our law enforcement, defense and intelligence community,” Carlson said, defending Amazon’s sale of Rekognition to government agencies. She reportedly told the crowd, gathered at the Aspen Security Forum, that Amazon has not “drawn any red lines” around its sales to government.
“We provide [government agencies] the tools, we don’t provide the solution application that they build,” Carlson said. “And we often don’t know everything they’re actually utilizing the tool for.”
To recap: Amazon executive Matt Wood, the head of artificial intelligence for the company, pushes back against civil libertarian and racial justice concerns about Rekognition by saying Amazon customers must abide by the AWS acceptable use policy, which says customers can’t use Amazon products to break the law. But another Amazon executive, Teresa Carlson, says “we often don’t know everything [government agencies are] actually utilizing the tool for,” meaning it’s not possible, at least right now, for the company to ensure its customers aren’t using its products to violate people’s rights.
The ACLU’s experiment highlights the precariousness of this hands-off approach. As ACLU of Northern California Technology and Civil Liberties Attorney Jacob Snow points out, the misidentifications of the 28 Congress members—again, disproportionately members of color—should not be taken lightly. “An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that,” Snow said.
The ACLU, other civil rights organizations, and shareholders have urged Amazon to stop selling Rekognition to government entities. Amazon employees published a letter to Jeff Bezos echoing these demands. And Amazon executive Teresa Carlson said it herself in discussing Rekognition: “[W]e’ve got to make sure as a nation…people should have a voice and tools should always be used ethically.”
Ethical use begins at development and extends through business decisions concerning contracting. It means creating technology that does not produce biased results. It means ensuring that products are not sold to entities with long histories of racism and abuse. It means acknowledging problems, not sidestepping them by offering up vague platitudes or quibbling about confidence rates. If Amazon truly believes its “tools should always be used ethically,” the only sensible thing for the company to do is stop selling Rekognition to government entities.
This blog post was co-authored by Iqra Asghar and Kade Crockford.