If you read the headlines, downtime is a real and material risk to your business. Whether it’s international hackers, massive power outages or simply hardware and software that reach end of life, IT organizations continue to be challenged by unplanned downtime and data loss. And it’s costing them as much as $1.7 trillion a year.
According to a recent study commissioned by EMC – the Global Data Protection Index – more than half of enterprise organizations reported unplanned downtime, and one-third suffered from data loss in the past year.
So there’s no longer a question of ‘if’ your organization will be struck by unplanned downtime or data loss – it’s a matter of when and how much.
Enterprises fell into four categories based on their adoption and implementation of data protection technologies:
– Just 13% were "leaders" or "adopters" who have advanced data protection strategies and are less likely to encounter an interruption.
– The other 87% were ranked as "evaluators" or "laggards." These are organizations that are behind the curve when it comes to data protection.
The study also illustrates that the U.S., China and the Netherlands lead the rest of the world in data protection maturity. This advantage probably has much to do with IT budgets and a more mature view on the importance of data protection.
What is abundantly clear from the study is that if organizations want to minimize downtime and data loss, they must incorporate data protection into their overall IT strategy.
GETTING STARTED
Establishing a data protection strategy and plan is especially difficult when it comes to database administration. Big data and cloud computing are spawning new technologies and applications that require greater access and visibility into data.
There is also greater demand to tightly integrate applications and data to maintain performance. And there’s an abundance of rogue database administrators who deploy their own disaster recovery and backup solutions.
Database ownership is also increasingly blurred as more departments outsource database services to cloud providers. If multiple sales teams have adopted disparate SaaS solutions to manage prospect databases, who owns the protection of that data? The sales teams? The IT department? The service provider? Lack of clarity leads to confusion and more opportunities for cracks to appear in a data protection strategy.
If you are in the 13% of organizations that are already on top of their game – congratulations. For everyone else, here are a few recommendations to help you advance up the readiness scale.
Think of these as the three C’s of data protection for database administrators: Consumption, Continuum and Control.
THE THREE C’S
CONSUMPTION
You need to know where and how data is being consumed in order to ensure that it is protected anywhere and everywhere. Primary consumption models include traditional on-premise, virtualized infrastructure, hybrid cloud, and born-in-the-cloud.
If you are adopting a hybrid cloud infrastructure, it’s important that you cover data in both on-premise and cloud environments. The same applies to applications and data that your employees run in outside services like Google Docs and Office 365.
Regardless of the companies behind those brands, don’t assume that your data is protected. To develop a data protection strategy that covers everywhere and every way your data is consumed, you have to take a holistic view of total data usage.
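To make that holistic view concrete, here is a minimal sketch of a data-asset inventory that flags data with no protection policy on record, across consumption models. The asset names, fields and values are illustrative assumptions, not anything prescribed by the study:

```python
# Minimal sketch of a data-asset inventory that flags unprotected data
# across consumption models. All names and fields are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataAsset:
    name: str
    consumption_model: str            # "on-premise", "virtualized", "hybrid-cloud" or "saas"
    owner: str                        # the team accountable for protecting this data
    protection_policy: Optional[str]  # None means no policy is on record

inventory = [
    DataAsset("orders-db", "on-premise", "IT", "nightly-backup"),
    DataAsset("prospects-crm", "saas", "sales", None),  # e.g. a SaaS tool adopted outside IT
    DataAsset("analytics-lake", "hybrid-cloud", "IT", "replication"),
]

# Surface every asset with no protection policy on record.
for asset in inventory:
    if asset.protection_policy is None:
        print(f"UNPROTECTED: {asset.name} ({asset.consumption_model}), owner: {asset.owner}")
```

Even a simple audit like this forces the question of ownership raised earlier: every row needs an accountable owner, whether the data lives on-premise or with a SaaS provider.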
CONTINUUM
Next, ensure your data is protected across the entire continuum of recovery point objectives (RPO, the maximum amount of data you can afford to lose) and recovery time objectives (RTO, the maximum time you can afford to be down). That means having a data protection strategy that includes everything from continuous availability to backup to archiving.
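As a rough illustration of that continuum, the sketch below matches a workload’s requirements to a protection tier by RPO and RTO. The tier names and target values are hypothetical examples, not figures from the study:

```python
from datetime import timedelta

# Illustrative mapping of protection tiers to RPO/RTO targets,
# ordered from most to least stringent. Values are hypothetical.
protection_tiers = {
    "continuous-availability": {"rpo": timedelta(seconds=0),  "rto": timedelta(minutes=1)},
    "replication":             {"rpo": timedelta(minutes=15), "rto": timedelta(hours=1)},
    "nightly-backup":          {"rpo": timedelta(hours=24),   "rto": timedelta(hours=8)},
    "archive":                 {"rpo": timedelta(days=30),    "rto": timedelta(days=2)},
}

def tier_for(max_data_loss: timedelta, max_downtime: timedelta) -> str:
    """Pick the least stringent tier whose RPO/RTO still meet the requirement."""
    # Iterate from least to most stringent; dicts preserve insertion order,
    # so reverse to start with "archive".
    for name, targets in reversed(list(protection_tiers.items())):
        if targets["rpo"] <= max_data_loss and targets["rto"] <= max_downtime:
            return name
    raise ValueError("No tier meets these objectives")

# A database that can lose at most one hour of data and be down four hours:
print(tier_for(timedelta(hours=1), timedelta(hours=4)))  # -> "replication"
```

The point of the exercise is that not every database belongs in the same tier: paying for continuous availability everywhere is as much a strategy gap as protecting nothing.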
CONTROL
Efforts to secure databases commonly result in a hodgepodge of backup and protection solutions from multiple vendors. This can occur through M&As, internal restructuring or rogue administrators who take it upon themselves to select and install a protection solution.
Database owners who implement their own data protection measures create accidental architectures. These parallel architectures produce inefficiencies and exposures, such as managing separate licenses and support contracts for multiple vendors.
More importantly though, the more variables you have to monitor across more reporting tools, the easier it becomes to miss something – potentially creating pockets of unprotected data that only come to light when you try to restore information after an incident.
Companies with multiple data protection vendors are more vulnerable to disruptions. Of the companies EMC surveyed, those with three or more vendors lost three times as much data as those who had unified their data protection infrastructure around a single vendor.
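One way to close those cracks without waiting for an incident is to reconcile coverage centrally. Here is a minimal sketch, assuming each vendor tool can export the list of databases it protects; the vendor and database names are hypothetical:

```python
# Illustrative gap check: cross-reference a master database list against
# coverage reports exported from each (hypothetical) vendor tool.
master_databases = {"orders", "hr", "crm", "analytics", "inventory"}

vendor_coverage = {
    "vendor_a": {"orders", "hr"},
    "vendor_b": {"crm"},
    "vendor_c": {"orders", "analytics"},
}

# Union of everything any tool claims to protect, then subtract from the master list.
covered = set().union(*vendor_coverage.values())
gaps = master_databases - covered

print(f"Databases with no protection coverage on record: {sorted(gaps)}")
# -> ['inventory']: a pocket of unprotected data that would otherwise
#    surface only during a restore attempt.
```

A unified vendor eliminates this reconciliation work entirely, which is one reason consolidated shops in the survey lost less data.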
If you haven’t already deployed a comprehensive backup and recovery program for your organization, you should, and soon. Data protection needs to be a core pillar of your overall IT strategy, so I recommend conducting a storage and data protection assessment twice a year to gauge the health of your IT infrastructure and identify any gaps in your data protection strategy.
With data protection, the best defense is a good offense and the time you spend looking at the big picture will save you from bigger headaches further down the road.
Peter Smails is vice president of Product Marketing at EMC.