Life as a police officer isn't all high-speed chases and locking up bad guys. Often, crime fighting means paperwork. Lots of paperwork. Police officers must write lengthy reports after most calls, check-ins, and altercations. Could AI help save time on reports? And if so, would those reports be more or less trustworthy than a firsthand human account?
Sergeant Matt Gilmore works for the Oklahoma City, Oklahoma, police department. On duty, he wears a body camera with a microphone. He used to return to the police station after a call and spend up to 45 minutes writing a report. Not anymore. Now he has AI do the work.
Gilmore uses a program called Draft One. It’s made by a company called Axon and built with the same technology as ChatGPT. Artificial intelligence analyzes all the sounds captured by Gilmore’s bodycam microphone. In eight seconds, it churns out a police report.
“It was a better report than I could have ever written,” says Gilmore. “And it was 100% accurate.” The AI even documented a fact he didn’t remember hearing.
Draft One and similar programs are popping up in police departments across the United States. Many police stations already use AI to read license plates, identify gunshot sounds, predict crime locations, and even recognize faces—though not always with precision.
So when it comes to AI-generated reports, Oklahoma City’s local prosecutors urge caution. A police report helps a judge decide whether a suspect goes to jail. In some cases, it’s the only testimony a judge sees.
AI isn't perfect. The technology can produce false information. In the tech world, this phenomenon is called "hallucination." That's part of the reason Oklahoma City decided to set rules for AI use in police reporting.
Oklahoma City police officers can use Draft One only to create—well, a first draft. Officers might be called to testify in court about the accuracy of their reports. If they don’t actually know what’s in them, that’s a problem.
“They never want to get an officer on the stand who says, well, ‘The AI wrote that, I didn’t,’” says Axon founder and CEO Rick Smith.
In cities like Lafayette, Indiana, and Fort Collins, Colorado, officers can use AI for any type of report. But in Oklahoma City, officers can use it only to report on minor incidents. If someone is arrested, the officer must write a report the old-fashioned way.
But the old-fashioned way isn’t foolproof either. People can make mistakes and may have biases. Some experts think AI reports might be more accurate—and more just.
God wants judges to make just judgments. (Leviticus 19:15) In a world of ever-changing technology, that’s not always easy. Should police forces embrace AI as a tool in reporting? Should they stick exclusively to human writing? Or should they go a third way—using AI with boundaries, as in Oklahoma City?
Why? As you weigh your answer, remember: New technologies can save time and effort, but without care, they can create more problems.