Data. It’s the most important asset your business owns, and it’s growing, fast. Analysts put the growth of the amount of data being created and stored at anywhere between 60% and 80% per year. At that rate, the amount of electronic data most companies store doubles roughly every 18 months; some analysts project a 40-fold increase over the next ten years.
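A quick back-of-the-envelope check on that doubling claim, assuming a steady annual growth rate at the lower end of the analysts’ range:

```latex
% Doubling time for a quantity growing at a constant annual rate r:
%   t_double = ln(2) / ln(1 + r)
% At r = 0.60 (the lower end of the 60-80% range):
\[
  t_{\text{double}} = \frac{\ln 2}{\ln(1+r)} = \frac{\ln 2}{\ln 1.6}
  \approx 1.47\ \text{years} \approx 18\ \text{months}
\]
```

So the 18-month doubling figure is consistent with 60% annual growth; the higher end of the range would shrink it further.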

Analyst house Gartner reckons this data explosion will fundamentally change the way IT departments operate. "We are increasingly living, playing and working in a digital world where people will have no alternatives but to become ‘more digital’ with the assets they have available," says Stephen Prentice, vice president and Gartner Fellow. "In 2012, the internet will be 75 times larger than it was in 2002, and if Facebook was a country, it would be the third largest in the world (after China and India). Device and data proliferation is also a reality that cannot be escaped."

The results
So you’ve got data, but how well are you using it? How well are you managing it? How well are you making sense of the data you’ve amassed? To find out how organisations are coping with this unprecedented increase in data, and the data-quality challenges it presents, CBR spoke to 302 senior IT decision makers.

An initial look at the responses makes positive reading, but dig beneath the surface and we find some worrying results. Just under 80% of those we spoke to said they believe senior management understands the impact of poor data quality on an enterprise, with about 15% saying it does not and the rest none too sure.

However, well over half of organisations do not measure the cost of inconsistent, inaccurate or unreliable data. Why does this matter? Because poor data can be expensive. Inaccurate data can mean poorly targeted advertising campaigns, while increased regulation and legislation mean there is now more pressure to ensure data is accurate and reliable; companies cannot afford to be reactive when it comes to data-quality issues.
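To make that cost tangible, here is a minimal back-of-the-envelope sketch in Python. The record count, error rate and per-contact cost are invented for illustration; they are not figures from the survey.

```python
# Hypothetical cost of inaccurate data in a direct-mail campaign.
# All figures are illustrative assumptions, not survey data.

records = 100_000        # contacts on the campaign mailing list
bad_rate = 0.08          # assumed share of inaccurate or out-of-date records
cost_per_contact = 0.50  # assumed print-and-post cost per contact, in GBP

wasted_spend = records * bad_rate * cost_per_contact
print(f"Wasted spend per campaign: £{wasted_spend:,.2f}")
# Wasted spend per campaign: £4,000.00
```

Multiply that by every campaign, report and decision that touches the same records, and the case for measuring data quality becomes clear.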

Direct correlation
Most firms do appear to be aware of the value, monetary or otherwise, of data: just under 80% think their company treats it as a strategic asset. There is a direct correlation between the quality of your data and the performance of your business; better data helps you make better decisions.

Speaking to CBR about the survey results, DataFlux, a leading provider of data-management services, indicated that convincing companies of that link has become easier over the past few years, suggesting growing awareness of data’s strategic value. (You can see more of DataFlux’s reaction to the survey results by visiting our special website.)

Regulation, mentioned above in the context of measuring the cost of data, cropped up again when we asked about data policies. Asked whether their organisation has a clearly defined data policy (covering, for example, the adherence of data to business rules, the enforcement of authentication and access rights to data, compliance with laws and regulations, or the protection of data assets), just under 70% said it did. Around 30% said it did not (see Figure 1 for full results).

[Figure 1: Organisations with a clearly defined data policy]

For those that did have clearly defined policies, around half said that legislation and/or compliance was the main driver, followed by risk management, revenue optimisation and operational efficiency, with new technology implementation and cost control bringing up the rear (see Figure 2, below).

[Figure 2: Drivers behind data policies]

The Information Commissioner’s Office (ICO), of course, now has powers to fine organisations for data breaches, a rule introduced around the time CBR conducted the survey. This suggests that getting data in order was high on people’s agendas, but the fact that so few mentioned cost control as a driver implies the relationship between data and money is still something of a mystery to many.

On a similar theme, we also asked about formal data-governance strategies. The results were surprising: about 45% of respondents said they had one, just a couple of percentage points ahead of those that did not. Again, compliance and legislation was the top driver, followed by risk management, operational efficiency, revenue optimisation and cost control (Figure 3, below).

[Figure 3: Drivers behind formal data-governance strategies]

Too complex
Perhaps more interesting were the reasons given by those without a formal data-governance strategy (see Figure 4, below). About one-quarter said that it was too complex, something DataFlux accepts is a barrier for many organisations. The combination of technology, people and processes can often prove too much, given the changes in responsibility, thinking and culture that are required. Data governance is so complex that you cannot simply buy a product to do it; it needs a genuine collaborative effort between IT and the rest of the business.

[Figure 4: Reasons for having no formal data-governance strategy]

The survey also found that IT tends to be responsible for data governance across the organisation, with just over 40% of respondents falling into this category. Next up was a more alarming result: a shade under 30% of those surveyed said that each individual department within the organisation was responsible for its own data-governance initiative (Figure 5, below).

[Figure 5: Responsibility for data governance]

This again hints at a disconnect not only between IT and the rest of the business, but between different sections of the same organisation. The point of a data-governance strategy is that it unites data quality, policies, management and more under one roof, offering a uniform approach across the business rather than a segregated one.

Segregation of this kind generally results in a fragmented approach to data usage, meaning a company will not get the full benefit of its data. Fewer than 10% of the senior IT decision makers we spoke to said their organisation had a dedicated team of data stewards or a data-quality centre of excellence.

Having said that, DataFlux says it is seeing more of these initiatives driven by the business rather than by IT, so perhaps departments funding projects themselves makes some sense. Even so, common ground, with much closer collaboration between the two, is likely to be the best way for businesses to approach a data-governance project.

IT and the other divisions of a business are much more closely aligned when it comes to funding data-governance projects. Just under 40% said that IT tends to fund the initiatives, with departments funding their own a couple of percentage points behind (Figure 6, below).

[Figure 6: Funding of data-governance projects]

Continuing the compliance theme, 10% of our respondents admitted to suffering an issue related to data quality or loss that affected reputation or profit in the past 18 months. Such an event could be something like a compliance violation; in extreme cases the impact, whether reputational or monetary, can put a company out of business.

Just under 60% said they had reacted by putting new processes or technologies in place, leaving a not insignificant 40% or so that have not reacted at all, which does not bode well for any future compliance issues they may face.

Ongoing initiative
So what else did our survey reveal? To help tackle the data deluge, about 65% of respondents said they had carried out a data-cleansing or de-duplication project, with a near-even split between those doing it as a one-off and those taking a longer-term view. This could be considered a concern: as DataFlux points out, data management should be an ongoing initiative rather than a stop-start process. Operational efficiency was the primary driver for data-cleansing projects (Figure 7, below).

[Figure 7: Drivers behind data-cleansing projects]
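For a flavour of what even a simple de-duplication pass involves, here is a minimal Python sketch. The field names and the normalisation rule are assumptions for illustration; real cleansing tools rely on far more sophisticated fuzzy matching.

```python
# Minimal de-duplication sketch: collapse records sharing a
# normalised (name, postcode) match key. Field names are hypothetical.

def match_key(record):
    """Build a crude match key from name and postcode."""
    name = " ".join(record["name"].lower().split())
    postcode = record["postcode"].replace(" ", "").upper()
    return (name, postcode)

def deduplicate(records):
    survivors = {}
    for rec in records:
        # Keep the first record per key; a real tool would merge fields
        # and use fuzzy matching (edit distance, phonetic keys, etc.).
        survivors.setdefault(match_key(rec), rec)
    return list(survivors.values())

customers = [
    {"name": "Jane  Smith", "postcode": "SW1A 1AA"},
    {"name": "jane smith",  "postcode": "sw1a1aa"},   # duplicate
    {"name": "John Doe",    "postcode": "EC1A 1BB"},
]
print(deduplicate(customers))  # two records survive
```

Run as a one-off, the duplicates simply creep back; run as part of an ongoing process, the match rules can be refined as the data changes.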

Master data management (MDM), the idea of pulling all corporate data together into a single, accurate view, seems to be in its early stages if our survey is anything to go by. Just 23% had started an MDM initiative, but where this differs from data cleansing is that the vast majority of those, about 75%, said it was part of an ongoing process, suggesting more long-term thinking in this particular field.
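As a toy illustration of that single view, the sketch below merges one customer’s records from two hypothetical source systems into a golden record. The survivorship rule used here (the most recently updated non-empty value wins) is just one common choice; the schemas and sources are invented.

```python
# Toy "golden record" merge for master data management (MDM).
# Source systems, field names and survivorship rule are all hypothetical.

from datetime import date

def golden_record(records):
    """For each field, survive the non-empty value from the most
    recently updated source record."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if field != "updated" and value:
                merged[field] = value  # newer sources overwrite older ones
    return merged

crm = {"name": "Jane Smith", "email": "jane@example.com",
       "phone": "", "updated": date(2010, 3, 1)}
billing = {"name": "Jane Smith", "email": "",
           "phone": "020 7946 0000", "updated": date(2010, 6, 1)}

print(golden_record([crm, billing]))
# {'name': 'Jane Smith', 'email': 'jane@example.com', 'phone': '020 7946 0000'}
```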

Those companies that have invested in technology to help with data quality and management are generally happy with the impact it has had, according to our survey (see Figure 8, below). Around 35% of those we spoke to said the technology had proven valuable or extremely valuable, and none said it was not at all valuable, which looks like a ringing endorsement of data-quality technologies. Just under 40% said they had not invested in technology to solve the problem.

[Figure 8: Value of data-quality technology investments]

The responses on return on investment (ROI) could be considered worrying. Around 35% said the technology had delivered ROI, 10% said it had not, and the rest said they did not know. Does that suggest it is difficult to measure ROI on this sort of project? As with all projects, the key to ROI is in the planning: clear objectives need to be established before implementation begins, so the business understands exactly what it expects to get out of a project.
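For reference, the ROI calculation itself is simple; the hard part, as the responses suggest, is putting a credible number on the benefit before the project starts. The example figures below are hypothetical:

```latex
% Standard return-on-investment formula. Example figures are hypothetical:
% a project costing 100k that recovers 150k in savings returns 50%.
\[
  \text{ROI} = \frac{\text{benefit} - \text{cost}}{\text{cost}} \times 100\%
  \qquad \text{e.g.}\ \frac{150 - 100}{100} \times 100\% = 50\%
\]
```

Without a baseline measurement of what poor data costs today, the benefit term is guesswork, which may explain why so many respondents simply did not know.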

Conclusion
Our survey turned up a number of interesting stats: some good, some slightly worrying. But what is clear from the results, and from speaking to vendors in this space such as DataFlux, is that awareness of the importance of good data practices is rising. Figures from Gartner reflect this trend: the analyst house reckons the data-quality tools market will grow by around 12% a year for the next five years.


There’s more!
You can watch DataFlux’s president and CEO Tony Fisher and founder and CTO Scott Gidley discuss the results of the survey with CBR’s Jason Stamper here. Steve Evans talks to DataFlux’s Colin Rickard about the difficulties of achieving a single customer view here.