GDPR’s first birthday is fast approaching. The new rules marked one of the greatest shake-ups in the history of data protection legislation, providing greater security and rights to individuals and charging organisations to uphold much higher standards, writes Mathias Golombek, CTO at Exasol.
The legislation was, and still is, an opportunity for businesses. GDPR is a clarion call for businesses to take data protection, data management and the use of data seriously. It’s a call to understand your data, discover data relationships, develop better data strategies, improve your understanding of information lifecycles, and leverage your data to create efficiencies and innovative strategies.
Of course, this is all easier said than done. When, in just a single hour, a major company can generate millions of individual data points, it’s hard to make the most of data, let alone adhere to GDPR guidelines.
Significantly, unused data – or “dark data”, as it is usually called – is on the rise like never before. This mountain of data is hard enough to manage and use on its own. But GDPR also dictates that companies must know exactly where data can be found, and obtain consent for the storage and use of all identifiable personal information, so the task becomes overwhelming.
This is something most businesses are struggling with: 82% of enterprises don’t even know where all their critical data is, according to Exasol’s study ‘Moving the Enterprise to Data Analytics’.
Few obstacles in any data project are more significant than an inability to find, access or otherwise fully use the data sources that hold the value a business should be unlocking. But from a compliance perspective, not knowing the nature of your data – or indeed not knowing where it is at all – has become a serious problem, with serious legal ramifications.
This is where in-memory databases come in. At heart, databases aren’t just the scaffolding for data analytics and business intelligence. Used in the right way, they can be enormously beneficial for data governance, allowing data analysts to better correlate data – both structured and unstructured.
Today, that data is often spread across multiple departmental silos. Having the right tools to ingest datasets from a wide variety of applications and platforms – without having to rip and replace, or code complex custom middleware each time data needs to be integrated – can make or break a governance strategy.
The correct database can help a company identify the kind of information it is storing – whether it is personally identifiable information that would fall under GDPR, or more mundane data that doesn’t demand consent. Knowing where that data can be found, who has access, and how the data is being used is fundamental in putting together compliance procedures.
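To make the classification step concrete, here is a minimal Python sketch – not any particular database’s feature, and with entirely hypothetical column names and patterns – of how stored values might be scanned to flag columns that appear to hold personal data:

```python
import re

# Hypothetical patterns for two common kinds of personal data;
# a real classifier would cover far more (names, IDs, addresses...).
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def classify_columns(rows):
    """Flag which columns of a tabular dataset appear to hold PII."""
    flagged = {}
    for row in rows:
        for column, value in row.items():
            for kind, pattern in PII_PATTERNS.items():
                if pattern.search(str(value)):
                    flagged.setdefault(column, set()).add(kind)
    return flagged

# Illustrative data only.
customers = [
    {"id": 1, "contact": "alice@example.com", "notes": "renewal due"},
    {"id": 2, "contact": "+44 20 7946 0958", "notes": "call back"},
]
print(classify_columns(customers))
```

Here only the `contact` column would be flagged (as both email and phone), telling the compliance team which fields demand consent and which are mundane.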
In terms of enacting those procedures, having a database that can monitor data sources, the lengths of archiving periods, and erasure and access of data via subject access and deletion requests is vital for compliance and efficiency. Neither consumers nor businesses want to wait a week for a clerk to action a request in the paperless era.
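As a rough illustration of automating such a request – a sketch under assumed data shapes, not a real product’s API – an erasure request might be actioned across every known dataset while recording an audit trail:

```python
from datetime import datetime, timezone

def action_erasure_request(subject_email, datasets):
    """Remove a data subject's records from every known dataset and
    return an audit trail, as an erasure request would require."""
    audit = []
    for name, rows in datasets.items():
        kept = [r for r in rows if r.get("email") != subject_email]
        datasets[name] = kept
        audit.append({
            "dataset": name,
            "records_erased": len(rows) - len(kept),
            "actioned_at": datetime.now(timezone.utc).isoformat(),
        })
    return audit

# Hypothetical silos holding the same subject's data.
stores = {
    "crm": [{"email": "bob@example.com", "plan": "pro"},
            {"email": "eve@example.com", "plan": "free"}],
    "newsletter": [{"email": "bob@example.com"}],
}
trail = action_erasure_request("bob@example.com", stores)
```

The point is that the request is actioned in seconds across every silo at once, with a timestamped record of what was erased and where – rather than a clerk hunting through systems for a week.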
Even when it comes to security – another area where GDPR tightened existing regulation – fast-acting databases have significant utility. A company that suspects it has been the victim of insider sabotage or remote cyber-attacks could apply analytics across multiple sets of data and correlate them with network access logs, with a view to seeking out suspicious patterns.
People have a vital role to play in GDPR compliance, but the humble database, if well designed and easily extended into new data sources, provides a powerful tool in the data protection officer’s arsenal today and in the future.