In a special report on face recognition, 60 Minutes warns us that we have nowhere to hide — that our anonymous space is approaching non-existence. Framing the problem through a simplistic (and inaccurate) division between corporate and government deployments of the technology, CBS warns us that big business plans to exploit our faces for economic gain, whether we like it or not, while government plans to use the technology to keep us safe.
But all is not equal, the narrative says: according to the bureau itself, the FBI is bound by strict regulation and needs far more data to be effective. Unfortunately, CBS repeats the government’s claims as journalistic fact — no matter that there is information in the public record to contradict them.
The narrative doesn't add up
In the segment embedded above, CBS' Leslie Stahl tells us what we already knew about face recognition's role in the Boston bombing investigation: while the FBI had images of both suspects’ faces in US government databases, the “technology did not come up with a match.” Their fingerprints were what did them in, she says.
So what happened? How could this magic technology fail?
“Authorities won’t say what went wrong,” Stahl says, “but one possibility is that government databanks, through which the photos would have been searched, are not big enough. As we discovered, the FBI is working on expanding its database.”
Go back and read the prior two statements again. Authorities had images of both suspects in government databases, we are told. In fact, the brothers are likely in numerous databases: both men are presumably in DHS biometric databases because they are immigrants; Tamerlan is likely in the FBI’s own Criminal Justice Information Services (CJIS) database, because of his 2009 arrest; and Dzhokhar is in the Massachusetts Registry of Motor Vehicles database, because he has a driver’s license. So while CBS tells us that “one possibility is that government databanks…are not big enough,” that’s simply not a plausible explanation for face recognition’s failure in this case. They had all the information they needed.
Stahl later speaks with an FBI official who works on the bureau’s Next Generation Identification program, which the bureau boasts will be the biggest biometric database in the world. The NGI project, contracted for a billion dollars to war giant Lockheed Martin, is scheduled to roll out in 2014, when it will become available to police departments nationwide. The database includes records from the State Department, Homeland Security, and the Department of Defense, which has collected vast quantities of biometric information in Afghanistan and Iraq.
And what of US persons? Stahl asks the FBI official if his agency plans to store faceprints of every American in NGI. “No. Absolutely not. Just people who have been arrested,” the official says, with a totally straight face.
That statement probably makes lots of people feel better, but FBI documents say it isn't true. Those documents suggest the FBI is also storing faceprints and fingerprints of people who apply for civil licenses, those who visit the US or move here from other countries, and everyone who undergoes a federal background check for employment purposes. None of those people fall under the “photographed at time of arrest” category, contrary to the FBI’s claims on 60 Minutes.
According to 60 Minutes and the FBI, the bureau is hamstrung by tight regulations that limit what it can and cannot do with face recognition and our images. Seemingly implying that corporations have a freer hand to track us than does the government, Stahl asks why the FBI can’t just download photos from Facebook and other commercial sources. The official suggests that he couldn’t do that because if he did, he’d have lawyers “lined up outside” his office.
But as EFF’s Jennifer Lynch has highlighted, a 2008 Privacy Impact Assessment of the Next Generation Identification database explains that the database can receive photos or screenshots of people in public places taken by police or private security, and, directly contrary to the FBI’s statements on CBS, even photos available online via social networking sites like Facebook. The FBI is also working with states to manage federal government access to driver’s license databases through programs like the creepily named “Project Facemask,” essentially putting anyone with a state identification card into the biometrics mix.
EFF's Lynch writes:
Commercial sites like Facebook that collect data and include facial recognition capabilities could also become a honeypot for the government. The FBI’s 2008 Privacy Impact Assessment stated that the NGI/IAFIS photo database does not collect information from “commercial data aggregators,” however, the PIA acknowledges this information could be collected and added to the database by other NGI users like state and local law enforcement agencies. Further, the FBI's 2010 facial recognition presentation (pdf, p.5) notes another goal of NGI is to “Identify[ ] subjects in public datasets.” If Facebook falls into the FBI’s category of a public dataset, it may have almost as much revealing information as a commercial data aggregator.
As you can see, according to the FBI's own documents, its biometric plans are hardly limited to mugshots, contrary to the bureau’s claims on national television.
The CBS program strongly implies that the government’s hands are tied, that it needs more information, and that its biometrics tracking program is only aimed at “bad guys.” But as the FBI's own documents show, the facts suggest something else.
Corporate biometric tracking
The segment’s take on corporate biometrics tracking is more critical, but ultimately offers no hope that there’s a way out of a dystopian future in which companies know everything about us and track us all day every day.
While the segment’s claims that regulation ties the FBI’s hands are contradicted by the bureau’s own documents, it is right about one thing: businesses are almost entirely unregulated when it comes to biometric tracking.
One of the inventors of face recognition technology is concerned about just that. The powers he helped to unleash are having consequences that worry him profoundly. “My faceprint should be my property. My face should be as important as my health records, my financial records. It’s very important to me,” Joseph Atick tells 60 Minutes.
When Leslie Stahl pushes back, asking why his face is private if it is visible to the public and to many thousands of surveillance cameras, Atick explains a crucial fact about surreptitious biometric tracking: the privacy threat comes not from someone simply viewing his face, but from the identification of his face with his name and other private information about him. The coupling of our face with our confidential data, the linking of these two datasets, is the true threat posed by face recognition technology.
It’s obvious that our faces themselves are not private; we don’t wear masks every time we leave the house. But we don’t tattoo our names, addresses, work histories, social security numbers and other private information onto our faces for a good reason. This technology explodes that division, using our faces to unlock preexisting, enormous data troves about all of us. Our faces, which for thousands of years of human history betrayed hardly more than our ethnic origin and our mood, suddenly expose us completely.
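Atick’s point about linkage can be made concrete with a toy sketch. Everything here is invented for illustration — the vectors, names, records, and matching threshold are hypothetical — but it shows why the faceprint itself is not the secret: the danger is the join between a face captured in public and an enrolled database of personal records.

```python
# Illustrative sketch only: all data below is fabricated.
# A "faceprint" is a numeric vector extracted from a photo;
# two photos of the same person yield nearby vectors.
import math

# Hypothetical enrolled database linking faceprints to dossiers.
enrolled = {
    (0.12, 0.87, 0.45): {"name": "Jane Doe", "address": "123 Main St",
                         "employer": "Acme Corp"},
    (0.91, 0.10, 0.33): {"name": "John Roe", "address": "456 Oak Ave",
                         "employer": "Globex"},
}

def distance(a, b):
    """Euclidean distance between two faceprint vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(faceprint, threshold=0.2):
    """Return the dossier linked to the nearest enrolled faceprint, if close enough."""
    best = min(enrolled, key=lambda k: distance(k, faceprint))
    return enrolled[best] if distance(best, faceprint) < threshold else None

# A camera captures a face in public; the vector alone reveals nothing...
candidate = (0.13, 0.85, 0.46)
# ...until it is joined against the enrolled database, which yields everything.
record = identify(candidate)
```

The face in public is the key; the enrolled database is the lock. Regulating the technology means regulating that join, not the photographs themselves.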
So what can we do? Shall we simply roll over and accept that our children and grandchildren will be subjected to pervasive face tracking for the rest of their lives?
The 60 Minutes program makes it seem as if we must resign ourselves to constant invasions of privacy, to the total loss of anonymity in public. Our credit cards and cell phones already track us, the narrative goes, and face recognition is just another — admittedly more invasive — means to erode what’s left of our privacy.
This is needlessly — and dangerously — pessimistic thinking.
We don’t have to roll over and accept that in order to use technology to great benefit we must be ubiquitously tracked by government and big business. Europe has shown us that it’s possible to protect personal privacy while putting to great use the technological advancements that, if left unregulated, could plunge us head-on into a Minority Report nightmare.
A businessman Stahl interviews feeds us a line oft-repeated by business and government leaders who stand to gain from invading our privacy. His company merely provides us with “deals” based on the surveillance that is already occurring, he says. We may as well get a free Diet Coke in exchange for stores tracking us, the argument goes. We are already being tracked everywhere we go, so why fight it?
That’s a dangerous, authoritarian and fundamentally defeatist assessment of our ability to change our circumstances.
We ostensibly live in a democratic, free society, where we can affect public policy and shape our own destinies. Instead of giving up and throwing in the towel, we can and must legislate to protect our privacy from the government and from corporations. Where legislation doesn’t work or isn’t likely to succeed in the corporate realm, we can boycott companies that intrude into our personal lives in ways we don’t like.
Contrary to the doomsday predictions, we don’t have to accept that our government and major corporations will inevitably violate our privacy and dignity, that we have already lost, that it’s too late. We can and we must fight to preserve and expand what privacy and anonymity we have left.
We don’t need the FBI or big business tracking us everywhere we go, for no good reason. And we mustn’t submit to it, either.