January 11, 2022 (updated 29 Jul 2022, 11:03am)

How the Internet Watch Foundation uses tech to tackle child abuse images online

Internet Watch Foundation CTO Dan Sexton explains his organisation's role in combating the spread of illegal images online.

By Matthew Gooding

As our work and social lives went digital during the Covid-19 pandemic, a much darker trend was also unfolding: a sharp increase in the number of people accessing online child sexual abuse images. The Internet Watch Foundation (IWF), a charity dedicated to finding and removing these images from the web, detected 8.8 million attempts to access illegal material during the first month of the 2020 lockdown in the UK alone. The true scale of the problem is likely to be far larger.


The Internet Watch Foundation assesses hundreds of thousands of online child abuse images a year. (Photo by Serghei Turcanu/iStock)

For the IWF, the pandemic has exacerbated an already worrying increase in the volume of child abuse material being shared and viewed online. It says the number of images and videos it detects each year has grown 1,420% since 2011, and in November it said its analysts had detected 200,000 illegal images in 2021, the first time it had reached that grim milestone in a single calendar year.

But while technology is part of this problem, it can also be part of the solution. Tech Monitor spoke to the IWF’s chief technology officer Dan Sexton about how his team is developing bespoke software to support the charity’s work.

IWF CTO on the importance of staying on-premise

Sexton joined the IWF as CTO in February, having previously led information and computing services at the University of Cambridge’s engineering department. “I started my career on an IT helpdesk 20-something years ago, asking people to turn their computers off and on again,” he recalls. “Since then I’ve worked in the public sector in local authorities and academia, as well as spending some time in private sector software development.”

As CTO, Sexton is responsible for the charity’s technical department, which includes a team of three people overseeing IT infrastructure and a four-person development team which is currently being expanded. “My job has three main areas: one is internal IT infrastructure, one is software development because we do a lot of bespoke product development for the area we work in, and the last part of it is being a voice of expertise for our senior leadership team, advising our board and talking to external partners in government and industry,” he explains.

This final part of the job has been one of the most enjoyable, Sexton says. “A lot of the people who work in this space are very policy-focused, and having a voice with some technical experience behind it has been welcomed,” he says. “Being able to translate very technical things in a way people understand has been really fulfilling, and I’ve always felt like a welcome voice in the room.”



Working with the tech community (IWF member companies include the likes of Apple, Google and Microsoft, as well as some of the biggest names in telecoms and social media), governmental organisations and law enforcement agencies, the IWF’s analysts receive and assess reports of abuse material online, submitted either by these agencies or by the public via the organisation’s hotline. They then work to remove offending imagery and ensure it is not republished.

To do this effectively, the IWF has to store a large number of highly sensitive and harrowing images on its systems, and Sexton says that, unsurprisingly, security is a high priority as a result. It also means the IWF is not able to join the many organisations that have rushed to embrace cloud computing. “The really sensitive stuff is all kept on-premise, and we have a dedicated, secure air-gap network where all the data and images are stored,” he says. “We also do a lot of work with CAID (the Home Office’s child abuse image database) so have to ensure we follow the same practices they do when it comes to storing and processing large amounts of this type of material.”

The IWF is one of a handful of organisations with permission to store and copy this kind of data. “That’s a massive responsibility and means cybersecurity is a massive part of my role,” Sexton says.

How the IWF’s hashing technology tracks illegal child abuse images

At the core of the IWF’s work sit the analysts who receive reports from the public or hunt down child abuse content, Sexton explains. “These guys are taking public reports and going out proactively onto the internet to find websites that host or store child sexual abuse content, assess that content and try to have it taken down,” he says. “My team is looking at what tools we can develop to help them do their job most effectively.”

One way of doing this is through the use of hashes, assigning each offending image a unique value that can be shared with IWF members. “One of the tools we’ve been developing is called IntelliGrade, which enables automatic assessment and categorisation of images,” the IWF CTO explains. “We generate both perceptual and cryptographic hashes which our members can feed into their automated detection systems to find and automatically block and remove this material.”
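Sexton does not detail how IntelliGrade computes its fingerprints, but the distinction he draws is a standard one: a cryptographic hash changes completely if a single byte of a file changes, while a perceptual hash is designed so that visually similar images produce similar values. A minimal sketch in Python (assuming Pillow is installed; production systems typically use more robust perceptual algorithms such as Microsoft’s PhotoDNA or Meta’s PDQ):

```python
import hashlib
from PIL import Image

def cryptographic_hash(path: str) -> str:
    """SHA-256 of the raw file bytes: flips completely if one byte changes,
    so it only matches exact copies of a known image."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def perceptual_hash(path: str, size: int = 8) -> int:
    """Toy 64-bit 'average hash': downscale to an 8x8 greyscale grid and set
    one bit per pixel depending on whether it is brighter than the mean, so
    resized or re-encoded copies still produce near-identical bit patterns."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits
```

A platform holding a list of such hashes can check uploads against it without ever needing to see or store the original imagery, which is what makes hash lists shareable with IWF members.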

This is helping amplify the work of the IWF’s analysts, Sexton says. “We want to make the workflows of our analysts as efficient as possible,” he says. “They’re processing hundreds of thousands of images a month, and using technology enables them to get through those images more quickly as well as gathering more intelligence. I think this is one of the ways we can have a bigger impact fighting this problem – it’s not just about saying ‘these images are illegal’, we’re creating tools which enable the recording of things like estimated age, gender, and type of activity. That way we can create richer datasets which give us more insight on the problem we’re facing.”
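The article does not specify how these richer assessments are stored; purely as an illustration, a record along the following lines could capture the fields Sexton lists. The field names are assumptions rather than the IWF’s actual schema, though the Category A–C severity grading does reflect how indecent images are classified under UK guidelines:

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    # UK guidelines grade indecent images from Category A (most severe) to C
    CATEGORY_A = "A"
    CATEGORY_B = "B"
    CATEGORY_C = "C"

@dataclass(frozen=True)
class AssessmentRecord:
    perceptual_hash: int               # ties the grading to the image fingerprint
    severity: Severity
    estimated_age_range: tuple[int, int]
    gender: str
    activity_type: str
    analyst_id: str                    # who assessed it, for audit purposes
```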

Sexton says investing in proprietary technology is paying off for the IWF. “There are some tools out there used by other hotlines and law enforcement agencies,” he says. “But having an internal software development team building tools specifically for our processes means we’re able to quickly change direction as we expand our work and [if we] want to gather more data, we can program that in.

“One big focus at the moment is near-duplicate images,” he continues. “We’ve found that analysts will assess an image but then another might crop up which is almost identical. Rather than making them reassess that, we’re looking at ways we can flag that it’s the same and get the information copied across. Having the ability to do that in-house has been really good because we’ve been able to react quickly to what the analysts need.”
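A common way to implement this kind of near-duplicate flagging, and one consistent with the perceptual hashes Sexton describes, is to compare fingerprints by Hamming distance, the number of bits in which two hashes differ. A sketch building on the functions above (the threshold is an illustrative guess, not an IWF-tuned value):

```python
def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(h1 ^ h2).count("1")

def find_near_duplicate(new_hash: int,
                        assessed: dict[int, "AssessmentRecord"],
                        threshold: int = 10) -> "AssessmentRecord | None":
    """If a previously graded image has a close-enough fingerprint, return its
    assessment so the earlier grading can be copied across automatically."""
    for old_hash, record in assessed.items():
        if hamming_distance(new_hash, old_hash) <= threshold:
            return record
    return None
```

Identical files have distance zero; a crop or re-encode of the same image typically differs in only a handful of bits, while unrelated images differ in roughly half of them.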

At the moment, development is mainly handled by the IT team, but Sexton says he is ready to give the analysts themselves more opportunities to contribute, using low-code or no-code solutions. “My head of software is keen to put more capability in the hands of the analysts themselves,” he says. “I can see a huge amount of value in that. Some of my staff actually worked as analysts themselves before moving into the tech team, which has been really valuable because they know what the analysts need and it maintains a close relationship between the teams.”

The role of machine learning and AI in tackling online child abuse images

Developing more automated tools to aid the IWF’s analysts is a big focus for Sexton heading into 2022. He adds that the organisation can also play a role in helping third parties develop artificial intelligence tools to combat the problem of abuse images.

“We’re looking at machine learning classifiers which can look at an image and tell what’s in it,” he says. “This could potentially remove some of the human assessment aspects and help our analysts work a bit faster. There’s also the training and testing elements of these models, because we see other safety tech providers developing systems which rely on learning from data sets. That’s a potential future for us because we’re one of the few organisations which hold these kinds of data sets. We’re looking at how we can improve internally but also have a bigger impact externally.”
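To give a sense of what such a classifier might look like, the skeleton below fine-tunes a pretrained backbone from torchvision for a handful of classes. The class count, labels and any training data are placeholders; nothing here reflects the IWF’s actual models:

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # placeholder: e.g. "no action" plus three severity grades

# Pretrained ImageNet backbone with a fresh classification head to fine-tune
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

def classify(batch: torch.Tensor) -> torch.Tensor:
    """Per-class probabilities for a batch of normalised 3x224x224 images."""
    model.eval()
    with torch.no_grad():
        return torch.softmax(model(batch), dim=1)
```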

Indeed, the IWF CTO says he is motivated by the opportunity “to have some social impact and do something beyond yourself” through his work at the IWF. “I think in common with most people who work here, the motivation is being part of that bigger purpose,” Sexton says. “When I was at the university there was an aspect of that, because you’re contributing to society through excellence in teaching and research. But in an IT context that was very much about supporting others in their work.

“IWF is very different in that you can really see the direct impact you’re having, helping the most vulnerable in society by using new and emerging technologies.”
