Privacy SOS

We just got one step closer to understanding black box algorithms

In the digital 21st century, large tech companies shape the decisions we make and the information we consume, in ways that are impossible for the ordinary user to see or understand. The algorithms that determine which information will be shown to which people, at which times, are proprietary trade secrets. The public must rely on expert researchers to inform the rest of us when these systems are working in discriminatory ways. But an obsolete federal law is standing in the way of that research. The ACLU has sued to ensure researchers and the public can investigate ‘black box’ algorithms online, to ferret out information about discrimination that may be happening right in front of our eyes, but is camouflaged as neutral online content.

The federal government has tried to get this lawsuit dismissed, but on March 30th, federal judge John Bates ruled that the ACLU can move forward with the suit, which challenges a section of the Computer Fraud and Abuse Act (CFAA). The overbroad 1986 law criminalizes using a computer in any way that “exceeds authorized access,” and in the past, the government has interpreted this language to mean that violating a website’s terms of service is a federal crime. (Have you ever read the terms of service on a website you’ve visited?) Because websites set their own terms, the specific prohibitions vary from site to site. On some websites, making multiple user accounts, scraping content, or giving fake personal information is a terms of service violation. In Sandvig v. Sessions, the ACLU argues that criminalizing this kind of online activity violates the First Amendment.

The criminalization of these activities particularly harms researchers, like the plaintiffs in this lawsuit, who examine discrimination in online advertising. These studies are becoming increasingly important in a world mediated by black box algorithms maintained by large tech companies. Making fake accounts and tracking the way sites respond to them is one of the few ways to gauge whether a platform like Google is discriminating against its users.

The findings from these studies can be extremely significant. For example, one such study revealed that men were more likely to receive Google ads for high-paying jobs. Another uncovered that Facebook allowed advertisers to exclude people from seeing their housing advertisements based on race, targeting ads only to white people. To continue these important investigations, researchers must be able to collect data by making fake accounts and scraping websites—without fear of arrest or prosecution. Under current law, an overzealous federal prosecutor may deem that type of research illegal. The ACLU’s suit argues that criminalizing these methods of data collection “violates the First Amendment because it limits everyone, including academics and journalists, from gathering the publicly available information necessary to understand and speak about online discrimination.”

Northeastern University professors Alan Mislove and Christo Wilson, the ACLU’s plaintiffs, designed a study to test for discrimination in popular hiring websites. To conduct the study, they need to create fake profiles. Their work has the potential to reveal biases that reinforce systems of inequality and injustice, but they currently cannot realize that potential without fear of prosecution.

In his decision to allow the ACLU’s suit to proceed, Judge Bates affirmed that “the CFAA threatens to burden a great deal of expressive activity, even on publicly accessible websites—which brings the First Amendment into play.” That’s a good sign. Professors Mislove and Wilson, as well as all other researchers interested in unpacking black box discrimination, should not be barred from uncovering discriminatory code by websites’ terms of service. Let’s hope the federal courts ultimately agree.

This blog post was co-authored by Kade Crockford and Iqra Asghar.

© 2024 ACLU of Massachusetts.