This week, the UK government published the text of its proposed replacement for GDPR, now known as the Data Protection and Digital Information (DPDI) Bill. The government says the bill will reduce the compliance burden on UK businesses, and experts told Tech Monitor that, on balance, it is likely to have the desired effect.
But for businesses with customers in Europe, the bill’s divergence from GDPR is likely to make compliance more complex, they warned. And, alongside the government’s new ‘AI rulebook’, it introduces new responsibilities that may require additional technical expertise.
DPDI Bill compliance implications
Formerly known as the Data Reform Bill, the DPDI Bill is the UK government’s proposed replacement for GDPR. The government outlined its plans for the bill in June, and this week published the text of the bill in full.
Significant changes in the proposed bill include placing the Information Commissioner’s Office under parliamentary scrutiny, new rules on web cookies and digital identity, and the removal of the need for small businesses to hire data protection officers.
An assessment of the proposed bill by the International Association of Privacy Professionals (IAPP) found that its underlying principles remain consistent with GDPR. “From a compliance perspective, the essential similarities between the two regimes will not cease to exist once the DPDI Bill becomes law,” it found.
The detailed changes will have a mostly positive impact on data protection officers and privacy professionals, says Dr Katharina Koerner, senior fellow at the IAPP, as they will reduce their compliance burden.
However, there are some areas in which the bill would, as proposed, increase compliance requirements, Koerner adds. These include the new provisions around the use of personal data for research and the requirement to notify the ICO of unlawful direct marketing.
The cost of breaches may also increase. For example, the maximum fine for infringing rules on web cookies or direct marketing is currently £500,000, but this would increase to as much as 4% of global annual turnover. For a company with £1bn in global annual turnover, that would mean a potential fine of up to £40m.
Compliance may be more complex for businesses that have customers in the EU, says Lilian Edwards, professor of law, innovation and society at Newcastle University. Companies that comply with the UK’s proposed new laws may find that they “aren’t legal when selling to EU customers”.
As a result, these businesses may choose to remain compliant with GDPR, even in the UK, says Robert Grosvenor, managing director at professional services firm Alvarez & Marsal’s disputes and investigations practice.
“From an infrastructure and operational standpoint, it may be more cost-effective to retain EU standards across a multinational organisation rather than adopt separate data management practices for UK customers,” he explains.
The proposed bill would replace GDPR’s requirement for organisations to appoint a data protection officer (DPO) with an obligation to instead identify a “senior responsible individual”. Koerner does not believe this will see the elimination of DPOs, however, as many of the measures mandated by the bill would require specialist expertise. “The need for our profession is ever growing,” she says.
The bill also includes legislation to promote the use of digital identity, allowing users to authenticate their identity without paper documents. This is an “impactful development for privacy professionals”, Koerner says. “New technologies such as self-sovereign identity will possibly get adopted and need to be understood and implemented.”
UK compliance rules on artificial intelligence
Along with a new AI rulebook, also unveiled this week, the DPDI Bill proposes new rules for the way in which organisations use artificial intelligence. For example, “automated decision-making is subject to certain safeguards and the processing of special categories of data for the purpose of mitigating algorithmic bias is legitimate,” explains Koerner.
The AI rulebook, meanwhile, requires that organisations appoint an individual with responsibility for AI use. Will this responsibility fall to data protection and privacy officers? That remains to be seen, says Koerner. “There is a very big overlap between responsible AI and privacy, and as a result, governance models are emerging.”
“The person owning the responsible use of AI may report to the chief privacy officer,” she explains. “But they might both report to a third function, like the general counsel. In SMEs, those two roles can potentially merge in one.”
Best practice for responsible AI is to “integrate it into existing governance structures”, says Lee Howells, head of AI at PA Consulting. This governance should be able to draw on functional expertise from across the organisation, he adds, as “such expertise is likely to be held in multiple resources rather than an individual, especially in larger organisations”.
The rulebook requires that the use of AI is explainable. As a result, says Howells, “the entity responsible for understanding why an AI application has advised a given decision needs to be proficient at asking ‘why’; stopping only when the data scientists responsible for developing the AI application can explain incontrovertibly why it is determining the answer it is.”
Whether or not they are ultimately given responsibility for AI, privacy and data protection officers need to understand the technology, says Koerner.
“AI is a fast-growing field. Privacy professionals need to constantly learn and educate themselves in this ever-changing AI environment. Besides technological aspects that must be understood, there are many open legal questions.
“With new regulations coming up, the complexity of AI requirements is increasing, as is the importance of cross-interdisciplinary collaboration, teams, and boards.”