NAS (Network Attached Storage) devices are rapidly gaining popularity, particularly amongst SMEs, as a cost-effective, scalable way to meet an organisation’s data storage needs. NAS offers many attractive benefits, including faster data access and relatively low administration costs. The technology is fairly simple and robust, and replaces more complex systems of file servers that require considerably greater on-site technical expertise to manage properly.
Robert Winter, chief engineer, Kroll Ontrack Data Recovery
So it’s little surprise that the NAS market is growing quickly. Many manufacturers have entered the market, each with its own proprietary file system for managing the stored data. This is where the hidden danger lies: non-standard file systems unique to a manufacturer are more costly and time-consuming to recover data from in an emergency.
In addition, a global study by Kroll Ontrack found that 40 percent of respondents identified human error as the most common cause of data loss, so it is important that employees understand best practice for the safe storage of company data on NAS. One of the key selling points of NAS is that it doesn’t need an on-site storage specialist, but as a consequence errors can arise from a lack of expertise or experience: managing the data volumes would normally be the responsibility of a specialist who fully understands the requirements of a virtualised NAS environment.
In my experience, the most common mistakes relate to the management of the data volumes and virtual disks on NAS devices, typically the deletion or accidental formatting of a virtual disk, a situation made far worse by the absence of valid back-ups.
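To illustrate the kind of safeguard that catches the ‘no valid back-up’ problem before it bites, here is a minimal sketch of an automated back-up check. Everything in it is a hypothetical assumption for illustration only: a NAS back-up directory mounted on a Linux host at /mnt/nas-backups, nightly archive files with .sha256 sidecar checksums, and Python 3.10+. A real deployment would use whatever back-up tooling the organisation already has.

```python
import hashlib
import sys
from datetime import datetime, timedelta
from pathlib import Path

# Hypothetical mount point where the NAS writes its nightly backup archives.
BACKUP_DIR = Path("/mnt/nas-backups")
MAX_AGE = timedelta(hours=26)   # one nightly run, plus slack
MIN_SIZE = 1024                 # bytes; rejects empty or truncated archives

def sha256_of(path: Path) -> str:
    """Stream the file so large archives don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(archive: Path) -> list[str]:
    """Collect human-readable problems with a backup archive."""
    problems = []
    age = datetime.now() - datetime.fromtimestamp(archive.stat().st_mtime)
    if age > MAX_AGE:
        problems.append(f"backup is stale (last written {age} ago)")
    if archive.stat().st_size < MIN_SIZE:
        problems.append("backup file is suspiciously small")
    # Compare against the .sha256 sidecar written at backup time
    # (a hypothetical convention for this sketch).
    sidecar = archive.with_name(archive.name + ".sha256")
    if not sidecar.exists():
        problems.append("checksum sidecar missing")
    elif sidecar.read_text().split()[:1] != [sha256_of(archive)]:
        problems.append("checksum does not match sidecar")
    return problems

if __name__ == "__main__":
    archives = [p for p in BACKUP_DIR.iterdir()
                if p.is_file() and not p.name.endswith(".sha256")]
    if not archives:
        sys.exit("FAIL: no backup archives found at all")
    newest = max(archives, key=lambda p: p.stat().st_mtime)
    issues = verify(newest)
    if issues:
        sys.exit("FAIL: " + "; ".join(issues))
    print(f"OK: {newest.name} looks like a valid, recent backup")
```

Run from cron or a monitoring agent, a check like this turns a silent back-up failure into a same-day alert rather than a discovery made after the virtual disk is gone.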
Also, the value of the data on NAS tends to be much higher than on workstations or mobile devices, especially when organisations virtualise their critical applications on NAS hardware. It’s therefore important to appreciate that if and when data loss occurs on NAS, recovering it is not always a simple job.
Unfortunately, many people are unaware of this risk until the worst happens, and precious data becomes irretrievable, resulting in damage to the business. Therefore, it is important that organisations using NAS plan for the safe and secure storage of their data and include data recovery in their disaster planning.
So why exactly is it difficult to recover from NAS?
- Proprietary file systems mean there is no ‘quick’ fix. Bespoke recovery techniques, often involving reverse-engineering the file system, are required for each manufacturer’s product in order to recover the data.
- The ease of use and scalability of NAS mean that it is often more cost-effective in the short term to bolt on additional storage than to manage data properly by integrating it effectively and securely with the existing network. The long-term trade-off is that in any data recovery situation, the larger the volume of data to restore, the longer recovery takes.
- The newer proprietary file systems found in NAS devices are less mature, and therefore less stable and reliable, than long-established file systems, so they are more likely to cause data loss.
- People often back up data using ‘tried and tested’ methods that do not work well with virtualised environments, meaning that restoring lost data from back-ups may be impossible or unacceptably slow; the restore drill sketched after this list is one way to find out before an emergency does.
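The only reliable way to know whether a back-up method suits a virtualised NAS environment is to rehearse the restore and time it. The sketch below does exactly that against a scratch directory; the rsync invocation, the paths, and the four-hour recovery-time objective are illustrative assumptions rather than a prescription.

```python
import subprocess
import sys
import time
from pathlib import Path

# Hypothetical locations: the backed-up NAS share, and a scratch
# area used purely for the rehearsal. Nothing live is touched.
BACKUP_SRC = "/mnt/nas-backups/finance/"
RESTORE_DST = Path("/scratch/restore-test/finance")
RTO_SECONDS = 4 * 60 * 60  # illustrative recovery-time objective

def drill() -> None:
    RESTORE_DST.mkdir(parents=True, exist_ok=True)
    started = time.monotonic()
    # --checksum makes rsync verify file contents rather than
    # trusting timestamps, which is the point of the exercise.
    result = subprocess.run(
        ["rsync", "-a", "--checksum", BACKUP_SRC, str(RESTORE_DST)],
        capture_output=True, text=True,
    )
    elapsed = time.monotonic() - started
    if result.returncode != 0:
        sys.exit(f"FAIL: restore did not complete: {result.stderr.strip()}")
    restored = sum(1 for p in RESTORE_DST.rglob("*") if p.is_file())
    if restored == 0:
        sys.exit("FAIL: restore 'succeeded' but produced no files")
    if elapsed > RTO_SECONDS:
        sys.exit(f"FAIL: restore took {elapsed:.0f}s, "
                 f"over the {RTO_SECONDS}s objective")
    print(f"OK: {restored} files restored in {elapsed:.0f}s")

if __name__ == "__main__":
    drill()
```

The point is less the specific tooling than the habit: a restore that has never been timed is, for planning purposes, a restore that may never finish.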
NAS clearly has excellent benefits, and its flexibility and price will ensure it continues to claim a growing share of the storage market for many businesses. Importantly, it doesn’t have to be problematic if you put the appropriate safeguards in place and handle the technology correctly.
Don’t bury your head in the sand by failing to adapt data management practices to the specifications of NAS technology or the value of the data it is storing.