
ICO Warns on AI Data Compliance: Publishes New Auditing Framework

"Shifting the processing of personal data to these complex and sometimes opaque systems comes with inherent risks."

By Claudia Glover

The UK’s data protection watchdog, the ICO, has unveiled a new AI auditing framework designed to help organisations ensure data protection compliance, warning that running personal data through such “opaque systems” comes with inherent risks.

The framework includes guidance on complying with existing data protection regulations when using machine learning and AI technologies.

The guidance, aimed at Chief Data Officers, risk managers and others involved in architecting AI workloads, comes as the ICO urged organisations to remember that, “in the majority of cases”, they are legally required to complete a data protection impact assessment (DPIA) if they use AI systems that process personal data.

The release comes after Computer Business Review revealed that users of AWS’ AI services were opted in by default (many unwittingly) to sharing AI data sets with the cloud heavyweight to help train its algorithms, with that data potentially being moved outside the regions they had specified for their workloads.

See Also – How to Stop Sharing Sensitive Content with AWS AI Services

ICO deputy commissioner Simon McDougall said: “AI offers opportunities that could bring marked improvements for society. But shifting the processing of personal data to these complex and sometimes opaque systems comes with inherent risks.”

Among other key takeaways, the ICO calls on AI users to review their risk management practices to ensure that personal data is secure in an AI context.

The report notes: “Mitigation of risks must come at the design stage: retrofitting compliance as an end-of-project bolt-on rarely leads to comfortable compliance or practical products. This guidance should accompany that early engagement with compliance, in a way that ultimately benefits the people whose data AI approaches rely on.”



In a comprehensive report that the ICO notes it will itself refer to, the AI audit framework urges organisations to record and document all movement and storage of personal data in each location. This allows the security teams handling the data to apply the appropriate security risk controls and to monitor their effectiveness. Such an audit trail will also help with accountability and documentation requirements should an audit take place.
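
As a rough illustration of what such a record could look like in practice (the framework itself does not prescribe any particular tooling), the Python sketch below appends one entry to an audit log each time a personal data set is moved or copied; the field names, log location and example values are all hypothetical:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

AUDIT_LOG = Path("personal_data_audit.jsonl")  # hypothetical log location

def record_data_movement(dataset_id: str, source: str, destination: str,
                         purpose: str, actor: str) -> None:
    """Append one audit-trail entry for a movement or copy of personal data."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "dataset_id": dataset_id,    # internal data-inventory reference
        "source": source,            # where the data was read from
        "destination": destination,  # where it was written or transferred to
        "purpose": purpose,          # why the movement happened
        "actor": actor,              # the person or service that moved it
    }
    with AUDIT_LOG.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(entry) + "\n")

# Example: training data copied from a governed store to a training cluster.
record_data_movement(
    dataset_id="crm-customers-2020",
    source="s3://governed-store/crm/customers.parquet",
    destination="hdfs://training-cluster/tmp/customers.parquet",
    purpose="model training",
    actor="ml-pipeline-service",
)
```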

Any intermediate files containing personal data, such as files compressed for data transfer, should be deleted as soon as they are no longer required. This reduces the risk of personal data leaking accidentally and boosts overall security.
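
A minimal sketch of that housekeeping step, assuming a Python pipeline and a placeholder `send()` transfer function (neither comes from the ICO guidance): the compressed copy is written to a temporary directory that is removed as soon as the transfer completes, even if it fails.

```python
import gzip
import shutil
import tempfile
from pathlib import Path
from typing import Callable

def transfer_compressed(source: Path, send: Callable[[Path], None]) -> None:
    """Compress a file for transfer, ensuring the intermediate copy is
    deleted as soon as it is no longer needed."""
    with tempfile.TemporaryDirectory() as workdir:  # removed on exit, success or failure
        compressed = Path(workdir) / (source.name + ".gz")
        with source.open("rb") as src, gzip.open(compressed, "wb") as dst:
            shutil.copyfileobj(src, dst)
        send(compressed)  # hand the compressed copy to the transfer step
    # the temporary directory and the compressed copy no longer exist here
```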

The use of AI itself creates entirely new challenges for risk managers, the ICO notes: “To give a sense of the risks involved, a recent study found the most popular ML development frameworks include up to 887,000 lines of code and rely on 137 external dependencies. Therefore, implementing AI will require changes to an organisation’s software stack (and possibly hardware) that may introduce additional security risks.”
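
To get a rough sense of that dependency surface for your own stack, the short sketch below (not part of the ICO report) counts the direct dependencies a locally installed framework declares; transitive dependencies would add many more.

```python
import re
from importlib.metadata import PackageNotFoundError, distribution

def declared_dependencies(package: str) -> set[str]:
    """Return the project names an installed package declares as direct dependencies."""
    try:
        requires = distribution(package).requires or []
    except PackageNotFoundError:
        return set()
    # Strip version specifiers and environment markers, keeping only the project name.
    return {re.split(r"[\s;<>=!~\[(]", req, maxsplit=1)[0] for req in requires}

for framework in ("tensorflow", "torch", "scikit-learn"):
    deps = declared_dependencies(framework)
    print(f"{framework}: {len(deps)} declared direct dependencies")
```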

Read the ICO’s AI Audit Framework Report Here

 
