Privacy SOS

California’s “computerized risk-assessment” tool doesn’t work so well, but Massachusetts should use it?

The Boston Globe is calling for Massachusetts state legislators to push through a provision of the fiercely debated "three strikes" law before an independent commission releases a report on statewide criminal justice in March. 

The provision would enable the parole board to make determinations regarding release and parole using a "computerized risk-assessment tool." The tool would analyze the past behavior and records of incarcerated people in order to estimate the likelihood of future criminal activity.

But California hasn't had such great luck with similar technology. In a 2011 report, the state's inspector general found that California's computerized risk-assessment system had an initial error rate of 23.5%, before leveling off at an error rate of 8%, which according to former California state senator and former prosecutor Ted Lieu "continues to this day."

How could the system make so many mistakes? As ever when it comes to the titillating subjects of data management and internal policy production, the devil is in the details. Data input, more precisely. Lieu writes:

The inspector general discovered that one reason for the high error rate is that the database the Department of Corrections uses in making risk determinations is missing critical information — the disposition of prior arrests and convictions — for nearly half the cases. In other words, of the more than 16.4 million arrest entries in the database, 47% are missing data on whether a conviction occurred, which is a crucial data point in the formula for eligibility for non-revocable parole.
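To make that failure mode concrete, here is a minimal, hypothetical sketch of how a missing disposition can flip an eligibility determination. The field names and the toy eligibility rule below are invented for illustration; they are not California's actual formula.

```python
# Hypothetical sketch: how a missing disposition corrupts a parole
# eligibility formula. The fields and rule here are invented for
# illustration and do not reflect any state's actual system.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ArrestEntry:
    charge: str
    violent: bool
    convicted: Optional[bool]  # None = disposition missing, as in 47% of entries

def eligible_for_nonrevocable_parole(record: list[ArrestEntry]) -> bool:
    """Toy rule: eligible only if no prior *conviction* for a violent offense."""
    for entry in record:
        if entry.violent and entry.convicted:
            return False
    return True

# When the disposition is missing (convicted=None), a violent arrest that
# in fact ended in conviction is treated as if no conviction occurred,
# so the formula wrongly returns "eligible."
record = [ArrestEntry(charge="assault", violent=True, convicted=None)]
print(eligible_for_nonrevocable_parole(record))  # True, despite an actual conviction
```

The point of the sketch is that the error is silent: the formula runs without complaint on incomplete records, and nothing in its output signals that nearly half of its inputs were missing the one field that mattered.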

Massachusetts should take note of these data-entry problems as it considers implementing a computerized risk-assessment scheme to handle the hot-button issue of parole. Efficacy matters, and if the legislature follows the lead of California and other states in adopting this technology, it should be sure to study their mistakes before implementation.

But to raise only questions of efficacy is to bypass questions of democratic governance, and that is not good policymaking. In other words, a tactic might work in practice but still come with attendant side effects, including the destruction of core democratic principles.

After all, aren't we supposed to be innocent until proven guilty? Should that now read "innocent until declared guilty of future crimes"?

In a society in which predictive policing — the notion that the government can stop crime if it knows everything — is becoming the core law enforcement ideology, enabled by a rapidly advancing total-surveillance security regime, we need to ask more basic questions about the role computers play in governance and policing.

If the government knew everything about everyone, could it stop every crime? Maybe. But do we want to live in that world? 
