March 15, 2021 (updated 29 Jul 2022, 9:45am)

This start-up is using AI to help prevent mass shootings

By combining gun detection software with CCTV, ZeroEyes hopes it can reduce police response times – but concerns persist about privacy and the technology’s reliability.

By Greg Noone

It was after his daughter came home from school in tears that Mike Lahiff resolved to do something about mass shootings in the US. She had returned, disturbed and frightened, after a “lockdown drill”, a training exercise the school had introduced in 2018 following a school shooting in Parkland, Florida, that left 17 pupils dead.

Several days later, Lahiff attended one of his daughter’s sports events. He noticed the CCTV cameras perched on the school walls and asked a security guard how the footage was used. “He kind of chuckled and said, ‘We only use them after something happens’,” recalls Lahiff. It was a lightbulb moment. “I was like, wait a second: why don’t we use cameras to detect guns so we can help with response times?”

ZeroEyes is one of several start-ups that combine visual AI with CCTV footage to detect unholstered weapons. (Photo courtesy of ZeroEyes)

Shortly afterwards, Lahiff founded ZeroEyes, a company that uses visual AI to detect when someone is carrying an unholstered weapon in CCTV footage, before alerting law enforcement. It is among a wave of start-ups claiming the technology can slash response times significantly, buying more time for civilians to shelter in place and for police to apprehend the shooter. “Our alerts will get to our clients within three to seven seconds,” says Lahiff – a significant improvement on the average police response time of 18 minutes.

Some have been left uneasy by this marriage of CCTV footage – some of it of variable quality – with computer vision software. For an AI, an automatic weapon may appear to be little more than “a dark blob on the camera screen,” as Tim Hwang, an expert in AI ethics, explained in an interview with Undark. This can easily lead to false positives: the gun detection system at a New York high school misidentified a broom handle as an automatic weapon.

This problem inevitably derives from poor training methods, says Lahiff, something ZeroEyes discovered early on when it initially trained its AI on images of weapons scraped indiscriminately from the internet. (“It worked like garbage,” he recalls.)

The start-up quickly pivoted to a more practical training method. “All of our data that we use to train our AI models is built in-house,” explains Lahiff. “We’ve filmed ourselves walking around with a plethora of different weapons and guns in a bunch of different environments: schools, office buildings, malls, even things such as water parks. And then we meticulously annotate those images.”
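ZeroEyes has not published details of its models, but the workflow Lahiff describes, staged footage hand-annotated with weapon bounding boxes, maps onto standard supervised object detection. A minimal sketch of that kind of pipeline, assuming a COCO-pretrained torchvision Faster R-CNN and an invented frames.json annotation file (neither is confirmed by the company):

```python
# Hypothetical sketch of fine-tuning an off-the-shelf detector on
# in-house annotated frames. ZeroEyes' actual architecture and data
# format are not public; "frames.json" is invented for illustration.
import json
import torch
from PIL import Image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.transforms import functional as F

class WeaponFrames(torch.utils.data.Dataset):
    """Frames filmed in-house, each with hand-drawn weapon bounding boxes."""
    def __init__(self, annotation_file):
        # Each entry: {"path": "frame.jpg", "boxes": [[x1, y1, x2, y2], ...]}
        self.items = json.load(open(annotation_file))

    def __len__(self):
        return len(self.items)

    def __getitem__(self, i):
        item = self.items[i]
        image = F.to_tensor(Image.open(item["path"]).convert("RGB"))
        boxes = torch.tensor(item["boxes"], dtype=torch.float32)
        labels = torch.ones((len(boxes),), dtype=torch.int64)  # class 1 = "weapon"
        return image, {"boxes": boxes, "labels": labels}

# Start from a COCO-pretrained detector and swap in a two-class head
# (background + weapon) rather than training from scratch.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)

loader = torch.utils.data.DataLoader(
    WeaponFrames("frames.json"), batch_size=4,
    collate_fn=lambda batch: tuple(zip(*batch)))
optimiser = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)

model.train()
for images, targets in loader:
    losses = model(list(images), list(targets))  # dict of detection losses
    loss = sum(losses.values())
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
```

The point of the curated data is visible in the dataset class: every frame comes from an environment the detector will actually see, annotated by hand, rather than scraped stock imagery with inconsistent angles and lighting.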

This approach – combined with an insistence that client footage be of suitably high definition – has vastly increased the accuracy of ZeroEyes’ software, Lahiff says. As an added safeguard, the start-up employs veterans at two control centres to rapidly verify the AI’s conclusions before an alert is issued. Its software is now embedded in CCTV systems covering schools, malls and offices across the US, and ZeroEyes claims it has issued no false positives to date.
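That human-in-the-loop step can be pictured as a review queue sitting between the detector and the alerting system: the model proposes, an operator confirms, and only confirmed detections go out. A hypothetical illustration (every name here is invented; ZeroEyes’ actual pipeline is not public):

```python
# Hypothetical sketch of the verification flow described above: AI
# detections are queued for a human operator, and an alert is dispatched
# only after explicit confirmation. All names are invented for the sketch.
import queue
import time
from dataclasses import dataclass, field

@dataclass
class Detection:
    camera_id: str
    confidence: float
    timestamp: float = field(default_factory=time.time)

pending = queue.Queue()

def on_model_output(det: Detection, threshold: float = 0.8):
    """Drop low-confidence detections; queue the rest for human review."""
    if det.confidence >= threshold:
        pending.put(det)

def operator_loop(confirm, dispatch_alert):
    """Runs in a control centre. `confirm` shows the operator the frame
    and returns True/False; `dispatch_alert` notifies the client and law
    enforcement. The whole loop has to complete in seconds to hit the
    three-to-seven-second target Lahiff cites."""
    while True:
        det = pending.get()
        if confirm(det):          # the human verification step
            dispatch_alert(det)
        pending.task_done()
```

The design trade-off is explicit: the extra human step costs a few seconds of latency but is what lets the company claim zero false positives, since no raw model output ever reaches a client unreviewed.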


Tackling mass shootings through AI: privacy worries

Despite the promise of the technology, some privacy advocates have raised concerns about the use of CCTV footage by gun detection start-ups. “There could be a chilling effect from the surveillance and the amount of data you need to pull this off,” said Hwang. Others have sounded the alarm over the combination of gun detection with facial recognition – a technology widely criticised for its problems with accuracy and racial bias.

Lahiff says ZeroEyes isn’t interested in integrating its software with facial recognition or using the footage for other purposes. “Our focus is on weapon detection,” says Lahiff. “We don’t store or record video on our side. We only have the alerts that are sent to us; they are the only thing that’s stored, and then purged.”

ZeroEyes’ approach is intended to increase the safety of students and office workers in a horrendous scenario, the prevalence of which has increased during the pandemic. But could the knowledge that they are being watched by AI make shooters more careful in evading detection?

Lahiff is sanguine on this point. Even if shooters “wait until the last second to pull that weapon out, eventually they’re still going to pull that weapon out,” he says – which means that ZeroEyes’ software will still detect the gun and issue an alert. Ultimately, says Lahiff, “it’s still going to help in that situation to decrease those response times and give better situational awareness to those first responders”.
