Minneapolis plans to invest in software to flag problem cops

Derek Chauvin had done it before.

Three years before he encountered George Floyd outside Cup Foods, the veteran officer hit 14-year-old John Pope with a flashlight and held him by the neck for 20 minutes. That same year, Chauvin pressed his knee into Zoya Code's neck in what a recent lawsuit filed by Code called Chauvin's "signature move."

Despite this pattern of violent policing, the Minneapolis police force kept Chauvin on the street — and even trusted him to train newcomers to the field.

Now, Minneapolis intends to invest in new software designed to raise red flags at the first signs of officers exhibiting patterns of dangerous behavior. With the help of a grant from the Pohlad Foundation, the city is expected to spend $1.25 million over the next five years to purchase and maintain a new early intervention system, with the aim of catching problem officers before the next catastrophic incident.

Early warning systems are not new; American law enforcement agencies have used them in some form for 40 years. But as the technology has evolved, the latest generation of developers is selling automated software far more advanced than the pen-and-paper systems of the '80s and '90s. And it comes with big promises for city governments looking to avoid costly lawsuits and accountability crises.

Ron Huberman, CEO of Chicago-based Benchmark Analytics, calls his product “the holy grail of police reform” for its data-driven approach to tackling police behavior. Benchmark was launched in 2017 and is now one of the industry leaders. Its website calls its product a “revolutionary all-in-one solution” at a time when “policing in America is at a crossroads.”

Some police departments have implemented Benchmark’s system to boost public confidence, such as that in Harvey, Illinois, which is trying to restore its reputation after a corruption scandal led to an FBI raid on the police station.

But even the most modern technology is not capable of completely eradicating police misconduct, said Seth Stoughton, a professor at the University of South Carolina law school who has studied early intervention technology.

"[They] are not Minority Report," he said, referring to Philip K. Dick's story about mutants who see crimes before they happen. "They are not predicting with absolute certainty that this officer is going to do something wrong in the future."

And in the decade before Chauvin killed Floyd, Minneapolis struggled to implement similar systems effectively, revealing the technology's greatest weakness: the people who use it.

Red flags to save careers

Early intervention technology has become mandatory for some US police departments.

After a yearlong Justice Department investigation uncovered a pattern of unconstitutional and racist policing at the Chicago Police Department, the city signed a consent decree agreeing to a series of sweeping reforms. The decree required Chicago police leadership to adopt early intervention technology that would "enable it to proactively detect risky behavior by officers under its command."

The University of Chicago helped develop a new system to meet the agreement's requirements, for example by providing police leadership with a data dashboard of shared metrics and enabling "peer group analysis" to surface problem officers within the force.
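
To illustrate, a peer-group comparison of this kind might look something like the following sketch. The metric name, badge numbers, and z-score threshold are hypothetical assumptions for illustration, not details of the University of Chicago's actual system:

```python
# Hypothetical sketch of "peer group analysis": compare each officer's
# record to officers with similar assignments and surface statistical outliers.
# Field names and the z-score threshold are illustrative assumptions.
from statistics import mean, stdev

def peer_group_outliers(officers, metric="use_of_force_count", z_threshold=2.0):
    """Return officers whose metric sits far above the peer-group average."""
    values = [o[metric] for o in officers]
    if len(values) < 2 or stdev(values) == 0:
        return []
    mu, sigma = mean(values), stdev(values)
    return [o for o in officers if (o[metric] - mu) / sigma > z_threshold]

# A peer group: officers working the same district and shift.
peer_group = [
    {"badge": 101, "use_of_force_count": 2},
    {"badge": 102, "use_of_force_count": 3},
    {"badge": 103, "use_of_force_count": 1},
    {"badge": 104, "use_of_force_count": 2},
    {"badge": 105, "use_of_force_count": 0},
    {"badge": 106, "use_of_force_count": 1},
    {"badge": 107, "use_of_force_count": 2},
    {"badge": 108, "use_of_force_count": 14},  # far above peers
]
print(peer_group_outliers(peer_group))  # flags badge 108
```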

Minneapolis could soon face a similar mandate. The city is negotiating a consent decree over unlawful policing with state human rights officials and expects another from the Justice Department in the coming months.

In recent months, police and IT officials have met to map out desired features for an early intervention system, and the city plans to issue a request for proposals soon, Minneapolis Police Commander Chris Granger said.

Granger said the department wants a system capable of deep data analysis, one that can pull from the city's disparate computer systems and offer comparative analysis of one officer against another. It also wants a system that can learn, with enough nuance to identify priority cases of officers who are "really at risk of poor performance and in need of intervention," and that can store data on past incidents from initial alert to resolution.
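
Tracking a case "from initial alert to resolution" implies a record with a lifecycle. Here is a minimal sketch of what such a record could look like; the statuses, field names, and example notes are assumptions for illustration, not features Granger described:

```python
# Minimal sketch of an intervention case tracked from alert to resolution.
# The statuses, field names, and example notes are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class InterventionCase:
    officer_id: int
    opened: date
    trigger: str                     # e.g. a cluster of use-of-force reports
    status: str = "alerted"          # alerted -> reviewed -> intervention -> resolved
    history: list = field(default_factory=list)

    def advance(self, new_status: str, note: str) -> None:
        """Record each step so the full history survives, alert through resolution."""
        self.history.append((date.today(), self.status, note))
        self.status = new_status

case = InterventionCase(officer_id=108, opened=date(2022, 6, 1),
                        trigger="use-of-force cluster")
case.advance("reviewed", "supervisor reviewed underlying reports")
case.advance("intervention", "assigned de-escalation training")
case.advance("resolved", "no new incidents in 90 days")
```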

Each intervention system is different, but a typical one weighs factors such as excessive use of force (a head strike, or a few punches or kicks in a short window of time, can set off a red flag), disciplinary action, alcohol- or other substance-related incidents, and citizen complaints.
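
A red-flag rule of that shape, where several use-of-force incidents inside a short window trigger review, can be expressed in a few lines. The 3-incidents-in-90-days threshold below is an assumption for illustration, not any vendor's actual rule:

```python
# Sketch of a rolling-window red flag: several use-of-force incidents
# within a short period trigger review. Thresholds are assumptions.
from datetime import date, timedelta

def should_flag(incident_dates, window_days=90, min_incidents=3):
    """True if any window of `window_days` contains `min_incidents` or more."""
    dates = sorted(incident_dates)
    window = timedelta(days=window_days)
    for i, start in enumerate(dates):
        # count incidents falling inside [start, start + window]
        in_window = sum(1 for d in dates[i:] if d - start <= window)
        if in_window >= min_incidents:
            return True
    return False

incidents = [date(2022, 1, 5), date(2022, 2, 10), date(2022, 3, 1)]
print(should_flag(incidents))  # True: three incidents within 90 days
```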

Benchmark's Huberman said a small percentage of police officers is responsible for 87% of serious misconduct investigations, and the likelihood of such an incident increases exponentially when high-risk officers work together. The system can alert police leadership early enough to intervene through targeted training, staggered officer shifts, or other "non-disciplinary" measures, which he said can save money in the long run.
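
One way to act on that co-occurrence risk is a scheduling check that keeps two high-risk officers off the same shift together. A sketch, with invented risk scores and cutoff:

```python
# Sketch of a shift-assignment check: avoid pairing two high-risk officers.
# Risk scores and the 0.7 cutoff are invented for illustration.
from itertools import combinations

def risky_pairings(shift_roster, risk_scores, cutoff=0.7):
    """Return pairs on one shift where both officers exceed the risk cutoff."""
    return [
        (a, b)
        for a, b in combinations(shift_roster, 2)
        if risk_scores.get(a, 0) > cutoff and risk_scores.get(b, 0) > cutoff
    ]

roster = [101, 104, 108]
scores = {101: 0.2, 104: 0.85, 108: 0.9}
print(risky_pairings(roster, scores))  # [(104, 108)]: reassign one of the two
```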

Some developers bill their products as officer support systems, which also measure compassion fatigue and other psychological strains on officers.

Vector Solutions-Acadis, an Indiana-based company, advertises its early intervention technology as also being able to flag positive behaviors for police leadership, which the website says “inspires motivation and sustained success.”

The system is meant to be more than an internal affairs tool; the goal is "to save careers," said Paul Boulware, a retired police commander who now works as a project consultant for the company's Guardian Tracking software.

Vector has 1,400 customers, including law enforcement agencies, fire departments and 911 centers, in all 50 states and in Canada. Boulware said the software is designed to help police build a "collective knowledge pool" that leadership can use to better manage officers. Features also include peer-to-peer commendations and flags for officers who may need counseling after responding to difficult calls.

But even with "all the tools in the world," he said, the system won't work "if people in the department don't actively engage with it."

Earlier efforts have died on the vine

Some police departments see downsides to early intervention technology, Stoughton said.

Creating a paper trail on problem officers could become a liability if an officer is later named in a lawsuit, he said, because it could make it look as if the department knew about the behavior and didn't do enough to stop it.

A police department's culture can also turn against the technology when officers, particularly at the top, see it as a punitive digital babysitter.

Stoughton compared early intervention technology to body-worn cameras: if officers don't turn them on, they don't make a difference. In the same way, failure to enter data accurately and consistently is an example of how human error can skew results and hamper a system's effectiveness, he said.

"Even if you have perfect input, there's a human factor: what do you do with the output?" he said. "You can't rely on it too much, and you can't rely on it too little. And that's not a problem that technology can solve."

On the human side, Minneapolis has failed in the past.

In 2015, a Justice Department review requested by then-Chief Janeé Harteau found significant "gaps" in an early intervention system the city had installed six years earlier.

Police officers did not believe in the system or disagreed about what constituted “problematic behavior,” according to the Justice Department report. They also saw it as more of a Human Resources-style “officer well-being program” than an accountability and risk management tool. And the “lack of automation” prevented systematic electronic flagging of behavior of concern.

“MPD should develop a new, prevention-focused EIS that incorporates broad stakeholder input, improves officer performance, manages risk, provides a continuum of interventions, and is supported by an automated information system,” the Department of Justice audit concluded.

After that, the city established a steering committee of police leaders, a city attorney, police union representatives, city staff and members of the public to develop parameters for a more effective system. The committee decided that the system should track use-of-force factors, such as punches, kicks and police-dog bites, as well as complaints and grievances to Internal Affairs or the Office of Police Conduct Review, according to 2017 meeting minutes. It would also track positive performance metrics, such as letters of commendation, community feedback, media reports and performance reviews.

David Bicking, a member of the steering committee, said he was “virtually certain” the system they developed would have flagged Chauvin, against whom at least 16 misconduct complaints have been filed.

But the city never implemented it — at least not in the way it recommended, Bicking said. “Like so many things, the ball was dropped. It died on the vine.”

Although the police department told human rights investigators that it had invested in developing early intervention technology, the state found the system was "nonexistent," according to findings released in April.

The human rights inquiry found that the lack of early intervention technology prevented the department from identifying officers who needed help, such as one who said he was "paranoid" about Black men.

One senior police officer described the early intervention system as just a "spreadsheet." Another said they didn't know one existed.

Another described the MPD's early intervention system as "a unicorn," suggesting it was imaginary.
