The biometrics future is now

Iris scans; face prints, palm prints, and fingerprints; ear patterns; gait and voice recognition; tattoos and scars: these are among the biometric and physical indicators that the federal government wants to collect in order to identify us — even without our permission or our knowledge.

Cutting-edge biometrics research spearheaded by the military for deployment in foreign wars is coming home to roost. The Department of Defense has for years been investing millions of dollars in biometric identification technologies, pushing the boundaries into the sci-fi realm. The world depicted in the dystopian film "Minority Report" is about to arrive in the flesh — and unless we act to arrest the flow of information into the government’s giant identification databases, we may wake up one day to find that we have no anonymity or privacy left.

These federally funded research projects on biometrics give us a sense of where the government is headed:

“Multi‐Modal Biometric Recognition of Non‐Cooperative Subjects at a Distance”

“Uncooperative Biometric Identification at a Distance”

“Neurophysiology‐Based Bilateral Asymmetry: A Face in Conflict as a Standoff Biometric Signature of Hostile Emotion or Malicious Intent?”

While the federal government is the driver of technological innovation in biometric identification tools, the private sector, universities, and local governments have jumped on the bandwagon. Major corporations like Lockheed Martin — which received the billion-dollar contract to build the FBI's new biometrics database — and small start-ups alike are cashing in on the next big thing in government monitoring. The only problem is that no one seems to have consulted the people of the United States about whether we think this is a good idea.

The news from the past few weeks alone provides a number of examples of the biometrics trickle-down: Boston University is using the technology on its student meal cards. The University of Vermont has installed fingerprint readers to control who gets into school buildings. Schools are using the technology to track and monitor students, even though a UK study showed that these institutions can’t be trusted to manage such sensitive data. And a private company is marketing an iris scanning tool that snaps an image of children before they board the school bus and sends an email to their parents, alerting them that their kid has gotten on the bus and is headed home.

Private companies may profit off the biometrics craze, but the credit for funding the boundary-pushing research belongs to Uncle Sam, and much of that funding started with the military. One estimate puts government funding of biometrics research at about $450 million per year. And that’s just the research and development track; the feds are also investing billions in implementing biometrics schemes, not least of which is the FBI’s latest contribution to the matrix: the Next Generation Identification (NGI) database.

According to the bureau, the “NGI Program Office mission is to reduce terrorist and criminal activities by improving and expanding biometric identification and criminal history information services through research, evaluation, and implementation of advanced technology.”

Translation? The FBI will be collecting and storing as much personally identifiable biometric data on as many people as possible, and then deploying that information to identify people in the street, at border checkpoints, and in central booking — or perhaps in your home or office, via a tiny drone that looks like a bug.

But it's just for criminals, right? 

The FBI says its project aims to “reduce terrorist and criminal activities,” but a thorough look at the bureau’s plan reveals that the soon-to-be-active NGI database will contain information on people the government doesn’t even suspect of any crime, including anyone who works for the federal government or has ever applied for a civil license with an agency that checks data against the FBI’s criminal database. (In Boston, that includes bike messengers.) But that's not all.

There are other ways your information might end up in the system, too: Some states are piloting programs in which they share all driver's license photos with the FBI for inclusion in NGI. The government might even pull images from open or quasi-open sources like Facebook to generate face prints of hundreds of millions of people in the United States — no probable cause or suspicion of a crime required.

In short, the NGI database aims to transform the 10-point fingerprint identification system the bureau has used for decades into one piece of a much larger, futuristic identity science, incorporating means of identifying people that do not require any physical interaction whatsoever. In the not-too-distant future, if we don’t act to bring the Bill of Rights into the digital age, government agents will be able to identify you and pull up your dossier simply by zooming in on your eye as you walk down the street, capturing your iris scan and running it against the NGI database.

The biometrics data collection doesn't begin or end in the United States, however. The Department of Defense, Department of Homeland Security, Department of State, and Department of Justice are integrating their biometrics databases, meaning the agencies will have access to information about hundreds of millions of people worldwide, including anyone who enters or exits a US border checkpoint. And the government has signed biometrics-sharing agreements with other countries, meaning your iris scan might end up being used against you in Turkey or Israel, not simply in Houston or New York.

Towards the future

The government's plan for NGI appears to be largely about data collection. Meanwhile, agencies are funding rapid advancements in the technologies that will put our biometric data to use identifying us, even when we'd rather stay anonymous or private.

Many of these studies hope to achieve better results for face recognition, particularly in low light and other imperfect conditions, such as capturing someone's face from a CCTV camera and running the image against stored face prints to identify the target. One such study takes the concept of live video identification to the next level:

Beyond basic image-to-image comparison of faces, there have also been breakthroughs in 'video-to-video' matching and 'still-face-to-video' matching. Improvements in face recognition technology will continue with programmatically focused investments by government agencies. For example, the U.S. Special Operations Command (USSOCOM) funded a proof-of-concept effort to create handheld, mobile binoculars capable of automatic face recognition at ranges up to 100 meters in outside daylight.
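To make concrete what "running an image against face prints" actually involves, here is a minimal illustrative sketch of the general pipeline: detect a face in a frame, reduce it to a numeric "face print," and compare that print against a gallery of enrolled prints. This is not the FBI's or any agency's system; the basic OpenCV detector, the placeholder embedding, and the empty gallery below are assumptions chosen only to show the shape of the process.

```python
# Illustrative sketch only: detect a face in one camera frame, turn it into a
# crude "face print" (feature vector), and find the closest enrolled print.
# The detector, embedding, and gallery here are stand-ins, not any real system.
import cv2
import numpy as np

# Basic face detector that ships with OpenCV
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_print(face_img):
    """Placeholder embedding: a real system would use a trained recognition
    model; here we just downsample the face crop to a normalized vector."""
    vec = cv2.resize(face_img, (32, 32)).astype(np.float32).flatten()
    return vec / (np.linalg.norm(vec) + 1e-9)

def best_match(probe, gallery):
    """Return the enrolled identity whose print is most similar (cosine)."""
    return max(((name, float(probe @ ref)) for name, ref in gallery.items()),
               key=lambda pair: pair[1])

gallery = {}  # would be filled from enrolled photos, e.g. {"alice": face_print(...)}

cap = cv2.VideoCapture(0)   # one frame from a webcam stands in for CCTV footage
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        probe = face_print(gray[y:y + h, x:x + w])
        if gallery:
            name, score = best_match(probe, gallery)
            print(f"closest enrolled identity: {name} (similarity {score:.2f})")
```

The point of the sketch is the architecture, not the accuracy: once face prints are enrolled in a gallery, every new frame can be matched against everyone in it, which is exactly why the size and sources of the government's gallery matter.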

Among the other studies the government has funded are those that aim to identify “hostile subjects” in crowds — in other words, people who would rather not be identified. One such project 

could enable some automated surveillance for discovery, detection, identification, and quantification of malevolent emotions — and for related hostile intent. This is an emerging option for eventual data fusion with state-of-the-art iris or facial recognition, facial thermography, vascular pattern, palm geometry, and/or odor typing. Furthermore, some biometrics systems being operationally tested in the field already incorporate remote video camera equipment, particularly for iris detection or enrollment, which could be dual-purposed for detecting and identifying facial emotional expression. Some smartphones already have rudimentary facial and emotional recognition capability, so it would be useful to study these relatively inexpensive technologies and assess their capability for field use in combination with other higher-technology equipment that is already in place.

Why would the government want such a tool? The theory is that the newly detected and identified “Face in Conflict” phenomenon of mixed polite smiles with intense negative emotions on one face provides a way of identifying individuals who are not on watch lists and do not have any prior arrests or detainments on record.

In other words, if you haven’t been arrested and aren’t suspected of terrorism but really do not like TSA agents rifling through your stuff or asking you to pose for a shake down, don’t make a face about it. That's because

a dynamic measurement of a persistent facial behavior such as negative or conflicted emotional state while walking through a whole-body scanner or being patted-down by a security officer, could provide a potential indicator of malicious intent in individuals who have not been watch-listed or identified in any other way as potential bad actors. This would provide a more objective cue for detaining that individual for secondary questioning or inspection.

Another federally funded project aims to identify “uncooperative” people at a distance of multiple meters. “The proposed project aims to create a high-accuracy, high-throughput biometric identification system that works with both cooperative and uncooperative subjects at multi-meter distances.”

According to Sandia National Laboratories, researchers have had success identifying subjects outdoors “out to 20 meters,” and are working on improving “the performance of the system, particularly illumination and tracking.” Ultimately, the DOJ wants to be able to identify people from a distance of 1,000 meters — or over half a mile.

The “Multi-Modal Biometric Recognition of Non-Cooperative Subjects at a Distance” project “will collect multiple biometrics including whole body images, face images, eye (iris) images, and ear images of subjects walking normally.  The system will automatically illuminate and capture these biometrics which will be used for biometric recognition research and experiments to test the ability of a computer to automatically identify who is walking by the system.”

It's a brave new world, so what can we do to ensure that these futuristic technologies don't utterly destroy our privacy? The first step is to stay engaged and aware of what's happening around us. We also need to keep talking to our friends, families, and neighbors about what is being done, largely without our consent. Finally, we need to take this information and our concerns to our elected officials and support their proposals for getting a handle on digital surveillance before it's too late.

Senator Al Franken held hearings on face recognition at the end of the last legislative session — a great first step. Now Congress needs to act to ensure that we retain our right to be free from unwarranted government intrusion, no matter what the technology allows. If we have the right to speak anonymously, we must retain the right to be anonymous. Biometrics fundamentally threatens that right, but there is still time to stop the worst-case scenario.
