We’ve always been wary of putting cops in schools. It just means administrative issues (i.e., student discipline) get the law enforcement treatment, which turns ordinary misbehavior into criminal acts and generates exactly the sort of school cop overreactions you’d expect.

Adding AI-assisted tech was never going to improve anything. Administrators were certainly told by salespeople that things like “gun detection AI” would not only increase safety, but reduce the number of personnel needed to keep guns out of schools.

Evolv has taken most of the heat in recent years for providing under-performing gun detection tech to schools and, with the outgoing NYC mayor’s blessing, city subways. In the first case, public records showed Evolv’s tech tended to flag a lot of harmless, normal things that students take to school (3-ring binders, laptops) as guns, suggesting the tech was really just an over-engineered metal detector.

In the second case, Evolv explicitly told the city of New York that its tech would under-perform if installed in subways. After all, it had already under-performed in a Bronx hospital, where it racked up an 85% false positive rate. Any place with lots of electricity, electronics, and people constantly moving around apparently overwhelmed the tech. But since Mayor Eric Adams was spending other people’s money, Evolv’s tech was placed in subways… where it immediately engaged in the expected under-performance.

Omnilert is the other big player in the school gun detection AI market, and its track record isn’t a whole lot better. Earlier this year, its tech failed to detect the gun carried by a student who killed one classmate and injured another before turning the undetected gun on himself.

Omnilert is back in the news, and it’s brought cops along for the ride.

Armed police handcuffed and searched a student at a high school in Baltimore County, Maryland, this week after an AI-driven security system flagged the teen’s empty bag of chips as a possible firearm.

[…]

“They made me get on my knees, put my hands behind my back, and cuffed me,” Kenwood student Taki Allen told CNN affiliate WBAL, describing what happened Monday evening when police arrived at the school while he was waiting with friends for a ride home after football practice.

Yep, that’s what they’re paying for in Baltimore County, Maryland: the sort of incident that not only makes readers desperate to search out the perfect “Takis” take on the AI blunder, but prompts dozens more to simply ask the obvious questions.

The absurdity and inadvertent hilarity can really only be appreciated by those who weren’t held at gunpoint by a whole bunch of police officers.

“The first thing I was wondering was, was I about to die? Because they had a gun pointed at me,” Allen told WBAL, saying about “eight cop cars” pulled up to the school.

Here’s how the Baltimore County PD phrased its response to the incident, which leaves out all the stuff about a swarm of cops and guns being pointed at someone who was holding nothing more dangerous than a handful of saturated fat.

The department told WBAL officers responded to “a report of a suspicious person with a weapon” but determined the person was unarmed after a search.

I know it’s unreasonable to expect a detailed play-by-play from police spokespeople, but this makes it sound like a couple of officers calmly approached someone, did a quick frisk, and went back to patrolling the neighborhood. And while I understand that anything short of a swift, potentially violent response to a suspected threat near a school is pretty much just asking for another Uvalde, there needs to be some sort of backstop between an AI thinking it saw a gun and cops showing up to wave around the only actual guns on the premises.

Omnilert has apologized, but its apology contains a statement that drastically redefines a commonly used term:

Omnilert, the company that operates the AI gun detection system, expressed regret over the incident and emphasized that its system is designed to identify a possible threat and elevate it to human review.

[…]

“While the object was later determined not to be a firearm, the process functioned as intended: to prioritize safety and awareness through rapid human verification,” the company added.

Calling the cops is not the same thing as “human review.” “Human review” is the missing step. And it’s a crucial one. It’s the thing that could possibly prevent someone from being killed by police officers who’ve been told the person they’re looking for is armed when they clearly are not.

Now that the thing directly adjacent to the “Worst Possible Outcome” (the killing of an unarmed minor by police officers) has happened, the district is promising to do a bit more of the “human verification” Omnilert claims is already happening. It will also apparently be asking Omnilert why this happened. And I imagine Omnilert will simply tap the canned statement it made (the corporate version of thoughts and prayers) and suggest it’s the school’s fault for trusting the AI’s alert, even though “you can trust the robot” likely makes up a significant portion of Omnilert’s sales pitch.

Oh. Wait. That’s exactly what the Omnilert sales pitch says:

Our expertise in AI has roots in the U.S. Department of Defense and DARPA related to real-time target recognition and threat classification. That military focus on high reliability and precision carried through to the development of our AI threat detection that goes beyond identifying guns to finding active shooter threats.

We employ a data-centric AI methodology that prioritizes high-quality training data. While traditional methods focus on data volume, sourcing millions of gun images, we take a quality-over-quantity approach. Our training data is hand-curated with rich annotations that improve accuracy and increase reliability.

Until the school either kicks this tech to the curb or forces Omnilert to do better, students would be wise to follow the announcements posted on the bulletin board and only purchase Doritoguns from vending machines during their lunch period or other pre-approved break periods.
