AI augments school security with threat detection

As public schools continue to grapple with on-campus violence, U.S. schools are increasingly looking to artificial intelligence to beef up security.

AI companies have ramped up their offerings with solutions including perimeter and entry screening for weapons, video- and audio-based gunshot detection, AI analysis for fight and threat detection, student device content analysis and more.

A recent Gallup poll showed 44% of parents “fear for their oldest child’s physical safety at school,” up from 15% in 2008. 

While the National Center for Education Statistics reports 93% of public schools use security cameras on campus, technology is now being leveraged to make sure “someone” is watching those feeds. 

AI can potentially support school resource officers who don’t have the bandwidth to watch every camera by detecting anomalous patterns in video that could indicate threats. 

“Most schools, many companies, don’t have the resources to have somebody looking at a camera every day, so having AI do that detection for you is like having a million eyes that you wouldn’t ever normally have,” Ben Gehle, chief technology and operations officer for Indian Creek School in Crownsville, Maryland, told local ABC News 7. 

Indian Creek uses Volt AI's system to monitor those camera feeds.

The startup already has contracts with public schools in 12 states, including Virginia's Loudoun County Public Schools, which has 98 schools, the local ABC affiliate said.

The price ranges from $385 to $500 per stream annually depending on the company, reports EdWeek. 

“If you’re paying somebody $30,000 a year just to sit and watch a camera, whatever performance level they had, and as sharp as they are, they’re never going to be able to focus consistently the way that artificial intelligence does,” David Wrzesinski, safety director for the Robinson school district in Texas, told EdWeek about the systems. 

Volt's systems cut response time to emergency events by 60% to 80%, with 90% accuracy, according to its website.

“Unlike traditional camera networks that capture footage for later review, AI-powered solutions analyze situations as they unfold, alerting security teams within seconds of detecting potential threats,” Volt claims. 

Other companies, such as ZeroEyes, Omnilert and Actuate, also provide gun detection services that complement anonymous threat reporting systems.

The latest threat detection systems also draw on tip lines and apps, social media monitoring, email and cloud document monitoring, and district websites and forms.

The systems also provide perimeter and entry security, including weapon detection.

These AI systems can notify authorities and staff members about threats quickly, within 2 to 5 seconds, Volt says.

AI tools such as Gaggle, Bark and Lightspeed also scan school systems and student devices for threats and indications of self-harm.

“During the 2024–2025 school year, we saw a significant rise in urgent concerns, including a 142% increase in gun-related images and a 24% rise in drug and alcohol incidents,” notes Gaggle. 

The company estimated it scanned nearly 6.9 billion pieces of student content over the 2024-2025 school year. 

While critics contend the systems often flag false positives, experts reply that one of AI's benefits is its ability to learn from those mistakes and provide better security over time, without the fatigue or forgetfulness of human monitors.

Omnilert and Volt, for example, use “human validators” to confirm all events reported, EdWeek said. 

As districts weigh their options, AI-assisted security holds growing appeal for public schools still confronting persistent campus violence.

Many U.S. parents believe there are better answers to school violence than continual active shooter drills.

“I’m a father of three, and my kids are terrified when they come home after or before the ‘active drills’ as they’re called these days,” Volt AI CEO and co-founder Dmitry Sokolowski told local ABC News. “And, to me, that’s almost unacceptable in our state, and I wanted to help do something about it.”