
Will AI crime reports by police hold up in court?

OKLAHOMA CITY 

Associated Press

A body camera captured every word and bark uttered as police Sgt Matt Gilmore and his K-9, Gunner, searched for a group of suspects for nearly an hour.

Normally, the Oklahoma City police sergeant would grab his laptop and spend another 30 to 45 minutes writing up a report about the search. But this time he had artificial intelligence write the first draft.

Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilmore’s body camera, the AI tool churned out a report in eight seconds.

“It was a better report than I could have ever written, and it was 100% accurate. It flowed better,” Gilmore said. It even documented a fact he didn’t remember hearing — another officer’s mention of the colour of the car the suspects ran from.

Oklahoma City’s police department is one of a handful to experiment with AI chatbots to produce the first drafts of incident reports. Police officers who’ve tried it are enthused about the time-saving technology, while some prosecutors, police watchdogs and legal scholars have concerns about how it could alter a fundamental document in the criminal justice system that plays a role in who gets prosecuted or imprisoned.

Built with the same technology that powers ChatGPT and sold by Axon, the company best known for developing the Taser and as the dominant US supplier of body cameras, the tool could become what Gilmore describes as another "game changer" for police work.

“They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate,” said Axon’s founder and CEO Rick Smith, describing the new AI product — called Draft One — as having the “most positive reaction” of any product the company has introduced.

“Now, there’s certainly concerns,” Smith added. In particular, he said district attorneys prosecuting a criminal case want to be sure that police officers — not solely an AI chatbot — are responsible for authoring their reports because they may have to testify in court about what they witnessed.

“They never want to get an officer on the stand who says, well, ‘The AI wrote that, I didn’t,’” Smith said.

Before trying out the tool in Oklahoma City, police officials showed it to local prosecutors who advised some caution before using it on high-stakes criminal cases. For now, it’s only used for minor incident reports that don’t lead to someone getting arrested.

“So no arrests, no felonies, no violent crimes,” said Oklahoma City police Capt Jason Bussert, who handles information technology for the 1,170-officer department.

That’s not the case in another city, Lafayette, Indiana, where Police Chief Scott Galloway told the AP that all of his officers can use Draft One on any kind of case and it’s been “incredibly popular” since the pilot began earlier this year.

Or in Fort Collins, Colorado, where police Sgt Robert Younger said officers are free to use it on any type of report, though they discovered it doesn’t work well on patrols of the city’s downtown bar district because of an “overwhelming amount of noise”.

Along with using AI to analyse and summarise the audio recording, Axon experimented with computer vision to summarise what’s “seen” in the video footage, before quickly realising that the technology was not ready.

“Given all the sensitivities around policing, around race and other identities of people involved, that’s an area where I think we’re going to have to do some real work before we would introduce it,” said Smith, the Axon CEO, describing some of the tested responses as not “overtly racist” but insensitive in other ways.

Those experiments led Axon to focus squarely on audio in the product unveiled in April during its annual company conference for police officials.

The technology relies on the same generative AI model that powers ChatGPT, made by San Francisco-based OpenAI. OpenAI is a close business partner with Microsoft, which is Axon’s cloud computing provider.

As the technology catches on, Bussert expects officers will become “more and more verbal” in describing what’s in front of them.

After Bussert loaded the video of a traffic stop into the system and pressed a button, the program produced a narrative-style report in conversational language that included dates and times, just like an officer would have typed from his notes, all based on audio from the body camera.

“It was literally seconds,” Gilmore said, “and it was done to the point where I was like, ‘I don’t have anything to change.’”

At the end of the report, the officer must click a box that indicates it was generated with the use of AI.
