Southern Maine police are testing new AI software to write their police reports

Sep. 29—Police agencies in Maine are dipping into the world of artificial intelligence, hoping, they say, to save hours of paperwork so they can do more policing.

But experts who have studied this technology question whether it will actually save time, or whether it will only bog down the courts and deepen distrust in the criminal justice system.

Lt. James Estabrook demonstrated the potentially time-saving new tool in the parking lot of the Cumberland County Sheriff’s Office in Portland this month. He hopped out of his cruiser, clicked a button on his body camera and walked through a fake traffic stop scenario.

After he pretended to issue a warning to his colleague for speeding, he ended the body camera recording with the click of a button. Behind the lens, the footage was sent to the cloud to be analyzed by AI, which, within seconds, produced the first draft of a police report.

The feature he’s been testing, “Draft One,” is software created by technology company Axon, which makes products for the military and law enforcement. Officers wearing Axon body cameras can upload any footage, which the software interprets to generate an AI-authored draft of a report based on the audio.

Cumberland deputies tried out the feature for 90 days this summer. So did the Portland Police Department, which has now extended its trial until it can get the money together to fully implement it.

Officers who tried out the tech described it as a time-saving and game-changing feature. They say it produces an accurate summary that sometimes is higher quality than their own writing.

Portland police wouldn’t disclose the price, but Estabrook said the estimated monthly cost is $30.42 per officer, or just under $35,000 per year for the sheriff’s office, a cost the sheriff said they can’t afford right now.

But neither agency wants to be the first in Maine to officially get on board — they anticipate backlash.

Portland police are taking a “low and slow” approach, said Maj. Jason King, waiting to see how other agencies and courts across the country reckon with this tech before deciding to use it in potentially high-profile criminal prosecutions.

“Somebody is going to be a trendsetter and it will probably trigger some Law Court decisions on whether or not this is a tool that can be helpful and reliable,” Cumberland County Sheriff Kevin Joyce said.

They might not have to be the first. Somerset County recently approved a five-year contract with Axon to purchase body and dashboard cameras and the Draft One software, all at a cost of about $840,000, according to reporting by the Morning Sentinel.

“It seems to me that it’s a pretty good bargain,” county Commissioner John Alsop said at a meeting this month. “If this is going to reduce the likelihood of lawsuits and other expensive outcomes, I think I’m in favor of it.”

But the AI innovation is not that simple and has some strong opponents in the legal field.

“Axon wants to make it sound as though this is giving every cop a personal secretary,” said Timothy Zerillo, a defense attorney in Portland and director of the National Association of Criminal Defense Lawyers. “A personal secretary is a human being that can interpret your words … That’s different than having a machine do the same thing.”

THE TECH

Draft One is powered by OpenAI, the creator of the generative AI chatbot ChatGPT. It uses GPT-4 Turbo as its base, a more accurate model than ChatGPT’s that can interpret both text and images to answer questions and solve problems, according to OpenAI’s website.

When an officer uploads their video, Draft One creates a report filled with optional brackets prompting them to insert additional details that may be relevant. For example, after a traffic stop, the software could prompt the officer to add the speed limit and observations about the driver. It also allows officers to edit the draft, make changes and report any errors. They can also use a speech-to-text function to dictate what they want to add.

The final paragraph discloses that the narrative was generated by AI. At the end, the officer must sign their name to testify to the accuracy of the document. Once all of the brackets are removed, they can either submit it through the Axon system or copy and paste it into their own records management system.

It is just one of many products within Axon’s “ecosystem” that can work in tandem with its other devices and software. If multiple officers wearing Axon body cameras are responding to an incident, Axon’s systems automatically link the footage as part of a single incident. Axon also makes Tasers, drones and other systems that law enforcement agencies depend on. The Portland Police Department is also looking into adding Axon drones to its fleet — at a cost of $40,000.

The software is being implemented amid a tricky, emerging landscape of technology in policing.

Andrew Ferguson, a law professor at American University in Washington, D.C., studies police surveillance technology, privacy and civil rights. He wrote a draft paper analyzing the software’s challenges.

Because the software only reports the facts, Ferguson wrote, it may result in “flattening” the narrative to be too formal and depersonalized. And because the AI system lacks nuance and cultural understanding, it may make unfair judgments, he said.

“Police have never just reported the facts,” he wrote. “Every narrative story of a police encounter involves a choice of how to describe the use of police power. Police reports are not narrations of the facts but are narrations of human interpretation of the facts.”

Even body cameras themselves, which can be turned off, can fail to capture objective data, he said. They record suspects, but not the officer’s actions. And because the Draft One technology relies on audio, Axon encourages the officers to narrate the situation.

“The same subtle biases observed in the video will be replicated on the transcript,” he wrote. “Perhaps more concerning, a sophisticated police officer will be able to narrate the facts to suit their desired outcome.”

Artificial intelligence models can be “trained” by feeding in certain data to produce certain results, like a police report.

Because large language models like GPT-4 are trained by people with their own biases, sometimes they produce content that can be harmful toward certain communities, said Jay Stanley, a senior policy analyst with the American Civil Liberties Union’s Speech, Privacy, and Technology Project.

“We all sort of intuitively understand … how (human) memory works, how storytelling works, but these AI agents are a really alien being,” Stanley said. “It doesn’t make sense to throw them into the middle of the criminal justice system, and the concern is, injustices will result.”

Axon’s team used the OpenAI model to generate “realistic” police report narratives and then fed them to the Draft One software to train it, according to a spokesperson. The company also chose police report narratives from agencies that opted into Axon’s voluntary program for product development.

Axon also boasts that the model has been calibrated “to prevent speculation or embellishments” and to stick to the facts when producing reports. The company said it conducted an internal study that found Draft One was nonbiased and objective.

What’s unclear is how Axon makes that calibration, and a spokesperson did not return several follow-up emails asking about it. Ferguson said how that calibration is made should be disclosed because the product is used for public safety by public officials.

“You have some obligation to explain how the technology was created if only just to quiet the people who might be asking hard questions,” he said.

But Estabrook, the Cumberland County lieutenant, said he feels comfortable with how accurate Draft One is and would use it when responding to any incident. The narrative only recounts what the body camera footage shows and doesn’t make anything up, he said.

“Be concerned if you want … be apprehensive … but read this final proof and watch the video,” he said. “The two should be darn near (the same).”

IS IT EFFICIENT?

Estabrook said although the tech wouldn’t be “massively” cost-saving, it would reduce overtime pay and make officers’ jobs easier.

The sheriff’s office first tested the tech with a few supervisors and three deputies, who Estabrook said varied in their report-writing skills. Then, for the final few months, the tech was turned on for everyone to try out.

Estabrook said most officers spend about three hours a day writing reports. Using Draft One, he said, a report that would usually take an hour to write took 10 to 15 minutes maximum.

At Portland police, the eight or nine officers using the technology found their report-writing time was cut nearly in half, from about 13 hours a week to about 6.5 hours, King said.

In Axon’s marketing materials, the company says that one of its customers, Colorado’s Fort Collins Police Services, saw an 82% decrease in report-writing time.

But at least one study discounts these claims.

Ian Adams, an assistant professor at the University of South Carolina’s criminology and criminal justice department, analyzed how officers in the Manchester, New Hampshire, police department interacted with the AI feature. The study, planned to be published this fall in the Journal of Experimental Criminology, found that there were no significant time-savings.

“Instead of assuming success, scholars and practitioners should be more open to the possibility that these tools might not deliver on all fronts and adjust our expectations accordingly,” wrote Adams, a former law enforcement officer.

He said all studies have limitations, so that may account for the discrepancy in his findings versus the other agencies. But he said self-reporting might also be to blame.

While the actual AI-generated narrative is written in seconds, the whole report-writing process can take about the same time as without it. Some agencies also have different expectations about what a report should look like, he said.

“All of this complexity is a really good reason to do careful, experimental work so that we can get away from self-reports,” Adams said. “When we’re spending this much public money, I think it’s incumbent upon us to be really careful in those measurements and really figure out if the tool … is accomplishing the explicit goals we have for it.”

IS AI FIT FOR COURT?

Some experts and defense attorneys are also concerned about how this AI tool fits into the legal system.

The sheriff’s office and Portland police met with the Cumberland County District Attorney’s Office and some assistant attorneys general this summer to discuss the use of Draft One in criminal cases.

District Attorney Jacqueline Sartoris said she was initially very concerned about the idea, but by the time the prosecutors left the meeting, they understood how it could be a helpful tool for police. Sartoris said that since the agencies are just piloting the tech, she doesn’t expect to see it used right away in criminal cases.

“If AI is chosen to be used within those departments, it’s going to be based on an ongoing conversation about when we feel comfortable, where we think it’s appropriate and questions that we have,” Sartoris said.

She said she’s not aware of any AI-generated police reports that have made it to trial.

King said the Portland Police Department is only using Draft One when responding to low-level misdemeanor crimes. But Estabrook said the sheriff’s office inevitably used it in crimes that ended in criminal prosecution, though he was not aware of how many. A majority, he said, were likely misdemeanor offenses.

Ferguson questions how people can evaluate this software or hold it accountable in the judicial system. A police report can be a controlling document, sometimes the only thing that a prosecutor will review before filing charges, he said. And in the majority of cases that don’t end in trial, the body camera footage won’t ever be reviewed.

In a trial, a report’s legitimacy could be questioned, and the court would need artificial intelligence experts to explain it, putting an extra burden on the courts, he wrote.

And errors can happen. An officer may accidentally perjure themselves because of a mistake in the report, he said, and because of that, legislators need to be involved in creating regulations for the technology.

To get ahead of those concerns in court, Ferguson suggests officers should turn over the audit trail that Axon can produce, which identifies which parts of the narrative were AI-generated. Ferguson said it could help alleviate a judge’s worry by showing which details in the report were added by the officer.

“If you don’t explain at the front end, you’re going to get a lot of critique,” he said. “It’s only going to take one careless officer and one error to potentially blow up the entire system based on AI-generated police reports.”

Zerillo, the defense attorney, said that while he discourages this use of AI, auditing the reports is a first step. Each agency should also publicly provide their standards of practice with the technology, he said.

He submitted a public records request to the Portland Police Department for its policy on using Draft One. While the department has a draft policy, King said, he would not provide a copy because it’s not finalized.

Ferguson said that, too often, policing launches a new technological innovation, a major scandal or mistake occurs, and then there is a rollback. He said before departments sign a contract with Axon for Draft One, the public and city councils should be engaged in shaping policies around this technology.

“In America today, you would start asking those hard questions before the scandal happens, before a mistake happens, instead of waiting for that to happen,” Ferguson said. “Yet, we don’t seem like we’ve ever learned that lesson.”

Adams said that while police departments shouldn’t have to wait for researchers to identify every consequence of this technology before implementing it, they should set goals to review whether it’s actually doing what they had hoped.

“I think it’s too early to panic and too early to celebrate,” Adams said. “We just don’t know.”