This Guy Trains Computers to Find Future Criminals
When historians look back at the turmoil over prejudice and policing in the U.S. over the past few years, they’re unlikely to dwell on the case of Eric Loomis. Police in La Crosse, Wis., arrested Loomis in February 2013 for driving a car that was used in a drive-by shooting. He had been arrested a dozen times before. Loomis took a plea, and was sentenced to six years in prison plus five years of probation.
The episode was unremarkable compared with the deaths of Philando Castile and Alton Sterling at the hands of police, which were captured on camera and distributed widely online. But Loomis’s story marks an important point in a quieter debate over the role of fairness and technology in policing. Before sentencing, the judge in the case received an automatically generated risk score indicating that Loomis was likely to commit violent crimes in the future. Loomis is a surprising fulcrum in this controversy: He’s a white man. But when Loomis challenged the state’s use of a risk score in his sentence, he cited many of the fundamental criticisms of the tools: that they’re too mysterious to be used in court, that they punish people for the crimes of others, and that they hold a defendant’s demographics against him. Last week the Wisconsin Supreme Court ruled against Loomis, but the decision validated some of his core claims. The case, say legal experts, could serve as a jumping-off point for legal challenges questioning the constitutionality of these kinds of techniques.
To understand the algorithms being used all over the country, it’s good to talk to Richard Berk. He’s been writing them for decades (though he didn’t write the tool that created Loomis’s risk score). Berk, a professor at the University of Pennsylvania, is a shortish, bald guy whose solid stature and I-dare-you-to-disagree-with-me demeanor might lead people to mistake him for an ex-cop. In fact, he’s a career statistician.
His tools have been used by prisons to determine which inmates to place in restrictive settings; by parole departments to choose how closely to supervise people being released from prison; and by police officers to predict whether people arrested for domestic violence will re-offend. He once created an algorithm that would tell the Occupational Safety and Health Administration which workplaces were likely to commit safety violations, but says the agency never used it for anything. Starting this fall, the state of Pennsylvania plans to run a pilot program using Berk’s system in sentencing decisions.
As his work has been put into use across the country, Berk’s academic pursuits have become progressively more fantastical. He’s currently working on an algorithm that he says will be able to predict at the time of someone’s birth how likely she is to commit a crime by the time she turns 18. The only limit to applications like these, in Berk’s mind, is the data he can find to feed into them.
“The policy position that is taken is that it’s much more dangerous to release Darth Vader than it is to incarcerate Luke Skywalker”
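The quote captures the asymmetry at the heart of these tools: as a policy choice, wrongly releasing a dangerous person is counted as far more costly than wrongly detaining a harmless one. Berk’s published forecasting work relies on random forests, and the sketch below, in Python with scikit-learn, shows one way such a cost ratio can be baked into a classifier. Everything concrete here is an illustrative assumption, not his actual model: the synthetic data, the features, and the 10:1 cost ratio.

```python
# Illustrative sketch of cost-sensitive risk forecasting with a random
# forest. All data, features, and the 10:1 cost ratio are assumed for
# demonstration; this is not Berk's model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(seed=42)

# Synthetic stand-in for a criminal-history dataset: 1,000 people,
# two made-up features (say, age at first arrest and number of priors).
X = rng.normal(size=(1000, 2))
# Synthetic labels: 1 = re-offended, 0 = did not.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 1.0).astype(int)

# The policy position from the quote: a false negative (releasing a
# "Darth Vader") is treated as ten times as costly as a false positive
# (incarcerating a "Luke Skywalker"). Up-weighting the high-risk class
# tilts the forest's votes toward flagging people as dangerous.
forest = RandomForestClassifier(
    n_estimators=500,
    class_weight={0: 1, 1: 10},  # assumed 10:1 ratio, set by policy
    random_state=42,
)
forest.fit(X, y)

# Risk scores for new cases: estimated probability of the high-risk class.
print(forest.predict_proba(X[:5])[:, 1])
```

Note that the cost ratio is not estimated from the data; it is a normative choice handed to the statistician before the model is fit, which is precisely what critics of these tools contest.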
At a panel discussion on these tools, the moderator, a researcher named Sandra Mayson, took the podium. “This panel is the Minority Report panel,” she said, referring to the Tom Cruise movie in which the government employs a trio of psychic mutants to identify future murderers, then arrests these “pre-criminals” before their offenses occur. The comparison is so common it’s become a kind of joke. “I use it too, occasionally, because there’s no way to avoid it,” Berk said later.
For the next hour, the other members of the panel took turns questioning the scientific integrity, utility, and basic fairness of predictive techniques such as Berk’s. As it went on, he began to fidget in frustration. Berk leaned all the way back in his chair and crossed his hands over his stomach. He leaned all the way forward and flexed his fingers. He scribbled a few notes. He rested his chin in one hand like a bored teenager and stared off into space.
Eventually, the debate was too much for him: “Here’s what I, maybe hyperbolically, get out of this,” Berk said. “No data are any good, the criminal justice system sucks, and all the actors in the criminal justice system are biased by race and gender. If that’s the takeaway message, we might as well all go home. There’s nothing more to do.” The room tittered with awkward laughter.