Privacy SOS

Data-mining: terrorism prevention or social control?


You may or may not have heard of the CIA's favorite data analysis company, Palantir, which currently operates out of Facebook's old offices in Palo Alto, California. But you likely have heard something about data mining software more generally; it's supposed to be the silver bullet that solves the data-flood problem for the world's spy agencies, which can't seem to know enough about our every movement, thought, purchase, communication, etc.

Software like Palantir is meant to make sense of the mass of swirling data that clogs databases at the FBI, CIA, DOD, NYPD, LAPD, and, increasingly, state and local police fusion centers. Those databases contain intimate information about all of us, even though the vast majority of us aren't plotting violent schemes, but simply going about our daily lives. Palantir and similar programs, the story goes, solve the drowning-in-data problem by "connecting the dots": piecing together seemingly unrelated data points to help intelligence and law enforcement agents distinguish the people who are planning to bomb something from those who are not.
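The "connecting the dots" step is, at its core, record linkage: grouping records that live in databases with incompatible schemas under some shared identifier. A minimal sketch of the idea in Python, using entirely made-up records, sources, and field names:

```python
from collections import defaultdict

# Hypothetical records from three separate "databases", each with its
# own schema. The only thing they have in common is a person identifier.
financial = [{"person": "alice", "type": "purchase", "amount": 120.50}]
phone = [{"person": "alice", "type": "call", "to": "555-0100"},
         {"person": "bob", "type": "call", "to": "555-0199"}]
travel = [{"person": "alice", "type": "flight", "route": "BOS-SFO"}]

def build_dossiers(*sources):
    """Group records from heterogeneous sources by a shared key,
    producing one merged view ("dossier") per person."""
    dossiers = defaultdict(list)
    for source in sources:
        for record in source:
            dossiers[record["person"]].append(record)
    return dict(dossiers)

dossiers = build_dossiers(financial, phone, travel)
```

Real systems must also resolve records that lack a clean shared key (matching names, addresses, device identifiers, and so on), which is where most of the engineering effort, and most of the errors, come from.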

Palantir does something besides highlight the supposedly dangerous among us, however. As Businessweek reports in a lengthy piece on the company:

An organization like the CIA or FBI can have thousands of different databases, each with its own quirks: financial records, DNA samples, sound samples, video clips, maps, floor plans, human intelligence reports from all over the world. Gluing all that into a coherent whole can take years. Even if that system comes together, it will struggle to handle different types of data—sales records on a spreadsheet, say, plus video surveillance images. What Palantir (pronounced Pal-an-TEER) does, says Avivah Litan, an analyst at Gartner (IT), is “make it really easy to mine these big data sets.” The company’s software pulls off one of the great computer science feats of the era: It combs through all available databases, identifying related pieces of information, and puts everything together in one place.

"Everything together in one place." Sounds creepy, right? It is. And contrary to claims made by Palantir, the CIA, and even the Businessweek piece, it doesn't succeed in preventing terrorism. It can't, because data mining and data analysis programs rely on patterns of suspicious behavior to determine who is a 'risk'. But as a Homeland Security-funded study showed in 2008, predictive terrorism modeling does not work. Why? There is no particular risk profile for people who are likely to commit heinous acts of violence. Furthermore, people intent on doing real harm will go out of their way to study the latest law enforcement approach and work diligently to get around it.
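There is also a simple arithmetic problem: terrorists are vanishingly rare, so even a very accurate behavioral classifier would bury its true hits under false alarms. A back-of-the-envelope illustration of this base-rate problem (all numbers are hypothetical, not figures from the DHS study):

```python
# Hypothetical numbers for a pattern-matching "plotter detector".
population = 300_000_000      # people under surveillance
actual_plotters = 3_000       # assumed real plotters in that population
true_positive_rate = 0.99     # detector flags 99% of real plotters
false_positive_rate = 0.01    # and wrongly flags 1% of innocent people

flagged_plotters = actual_plotters * true_positive_rate
flagged_innocents = (population - actual_plotters) * false_positive_rate

# Probability that a flagged person is actually a plotter (Bayes' rule):
precision = flagged_plotters / (flagged_plotters + flagged_innocents)
```

With these generous assumptions the system flags roughly three million innocent people, and fewer than one in a thousand of those it flags is an actual plotter. Realistic accuracy rates make the picture far worse.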

We've all heard the basic patterns to look out for: paying cash for one-way plane tickets; young men traveling alone; buying large quantities of fertilizer far from a farm. But the 9/11 attacks were so successful precisely because they were so unexpected. What makes the CIA think that the next round of spectacular attacks — if indeed it comes — will be anything like what it has seen before? In other words, how do you model for an infinite number of possible approaches?

You can't. So Palantir won't stop terrorism, full stop. On the other hand, data-mining software like Palantir is very useful for maintaining social control over people who are not constantly trying to evade the surveillance state, people who are simply going about their normal lives under its ever-watchful eye.

The ways in which Palantir can be deployed as a tool for social control are seemingly limitless:

Using Palantir technology, the FBI can now instantly compile thorough dossiers on U.S. citizens, tying together surveillance video outside a drugstore with credit-card transactions, cell-phone call records, e-mails, airplane travel records, and Web search information.

If the police want to know what you are doing and where you are going, they can. But toward what end? Can they really discern from your captured images and web reading habits whether you are a threat to society? Could Palantir have predicted, and therefore stopped, Jared Loughner from opening fire in a supermarket parking lot in Tucson, Arizona, killing six people and wounding thirteen others? If so, why didn't it?

Even though DHS found in 2008 that using data mining to predict terrorism doesn't work, and that it would be too great an assault on personal privacy even if it did, there are even more basic questions we should ask ourselves before we consent to surrendering our most basic rights to privacy and personal integrity to the state.

Foremost among these questions is: Can the government keep us safe from all harm at all times? Furthermore, do we want to live in a society wherein we give up all of our privacy, trading our sacred human dignity for (false) promises of personal safety? And if the true aim of the CIA's use of programs like Palantir is public safety, can the government use the technology to prevent car accidents and domestic homicides, which kill tens of thousands more Americans every year than terrorism?

The answers to these questions are obvious. It's time to say 'no' to the culture of fear that promotes the police state ideology.

Democracy has its risks; we either accept them, or we instead accept the rise of the creeping police state. We cannot have both.

© 2024 ACLU of Massachusetts.