Data storage backup is easy to do inefficiently. Best practices require some careful thought.
There are almost infinite ways to run data storage backups incorrectly, but doing it right means following a few tried-and-true practices. It's important to adapt data storage backup best practices so they continue to function well in the face of rapid technological advancement.
Here are some tips on how to do just that:
Widen Your Scope
The narrow view is to look no further than making sure your data is backed up: it's been sent over a wire to some target device and there it sits. That's fine as long as you never have to recover anything. At that point, you may find you didn't back up all your files, the backup failed, or the tape cartridge was corrupted. Avoiding that fate takes a more complete view of the purpose of backup.
“Backup strategies that only focus on making a separate copy of data, once a day, with little visibility into the recoverability of that data are incapable of meeting the needs of the modern business,” said Paul Davis, Director of Product Management, Data Protection, Dell Software. “Your end goal isn’t to back up data, it’s to keep your business up and running at all times, and the way to do that is to match your backup to your business.”
Consolidate Physical and Virtual
Any business that has been around for a decade or more is used to doing physical backups. When virtualization came along, many added separate software to back up virtual machines (VMs). Brian Greene, Symantec’s senior director of product management, advises users to lower costs and simplify backup tasks by deploying a single system that can protect both physical and virtual infrastructure.
“This will eliminate managing two backup solutions, paying for two backup programs, running multiple backup jobs, and backing up duplicate data across physical and virtual,” said Greene. “Select backup that is integrated with VMware vStorage APIs and Microsoft VSS for fast snapshots. This will minimize CPU, memory, and I/O performance impacts on the virtual host.”
Deduplicate Your Data
These days most backup products either come with deduplication bundled in or rely on a partner to provide it, and with good reason. Why back up 500 copies of the same PowerPoint the boss sent to everyone in the company?
“Data deduplication decreases network traffic, and reduces the disk space required for storing backup files which saves you money on storage costs,” said Greene.
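The mechanics behind those savings are straightforward: the backup stream is split into chunks, each chunk is fingerprinted with a cryptographic hash, and only chunks with a previously unseen fingerprint are actually written to storage. Here is a minimal sketch of the idea using fixed-size chunking, not any particular vendor's implementation:

```python
import hashlib

def dedupe(data: bytes, chunk_size: int = 4096):
    """Split a byte stream into fixed-size chunks; store each unique chunk once."""
    store = {}    # digest -> chunk bytes (the deduplicated store)
    recipe = []   # ordered digests needed to rebuild the original stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicates cost nothing extra
        recipe.append(digest)
    return store, recipe

def restore(store: dict, recipe: list) -> bytes:
    """Rebuild the original stream from the store plus the recipe."""
    return b"".join(store[d] for d in recipe)
```

Back up that PowerPoint 500 times and the store grows once; only the small per-copy recipe grows. Production appliances typically use variable-size chunking instead, so a few inserted bytes don't shift every subsequent chunk boundary and defeat the matching.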
Use an Appliance
As many organizations are either slimming down their IT rosters or forcing them to do a whole lot more with existing resources, it can sometimes make sense to channel the whole backup cycle over to a purpose-built backup and deduplication appliance (PBBA). These boxes take care of backup and deduplication, and often simplify the recovery process too.
“PBBAs help storage administrators simplify their tasks by reducing the amount of data needed to be backed up and retained, as well as cutting the bandwidth needed to replicate the deduped data over the wire for disaster recovery to another deduplication appliance,” said Casey Burns, Senior Product Marketing Manager, DXi, Cloud and Virtualization Solutions, Quantum.
Establish the Pecking Order
Christophe Bertrand, Vice President of Product Marketing for Data Management, CA Technologies, recommends that users take a hard look at application and system criticality and establish a protection and recovery “pecking order” that also reflects the interdependency of related systems. For example, a mission-critical application may have feeder systems that need to be protected at the same level.
“It’s a tough exercise, but one that allows organizations to truly understand their recovery point and recovery time (RPO and RTO) needs, establish tiers of recovery, and therefore determine what the data protection levels need to be,” said Bertrand. “Not every application or piece of data is born equal.”
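The pecking order Bertrand describes can be captured as a dependency-aware tier map: assign each application a base tier, then let every feeder system inherit the strictest tier of anything it feeds. The sketch below is purely illustrative; the tier numbers, RPO/RTO values and application names are hypothetical, not from the article.

```python
# Hypothetical recovery tiers; the numbers are illustrative, not prescriptive.
TIERS = {
    1: {"rpo_min": 15,   "rto_min": 60,   "method": "continuous replication"},
    2: {"rpo_min": 240,  "rto_min": 480,  "method": "snapshots every 4 hours"},
    3: {"rpo_min": 1440, "rto_min": 2880, "method": "nightly backup"},
}

def propagate_tiers(base_tiers: dict, feeds: dict) -> dict:
    """feeds maps each application to the feeder systems it depends on.
    A feeder inherits the strictest (lowest-numbered) tier of anything it
    feeds, repeating until stable so chains of feeders are handled too."""
    tiers = dict(base_tiers)
    changed = True
    while changed:
        changed = False
        for app, feeders in feeds.items():
            for feeder in feeders:
                strictest = min(tiers.get(feeder, 3), tiers.get(app, 3))
                if tiers.get(feeder, 3) != strictest:
                    tiers[feeder] = strictest
                    changed = True
    return tiers
```

If a Tier-3 rates feed supplies a Tier-1 billing application, it gets promoted to Tier 1, so both are protected to the same RPO and RTO, which is exactly the feeder-system point above.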
Coordinate Your Protection Layers
Perhaps things were simpler in the beginning. There was a small data center, a few servers and a relatively straightforward backup procedure. But IT infrastructures sprawl over time. You end up with a variety of backup programs, an entirely separate DR system, some kind of replication setup and several other layers of data protection. The problem is that these are usually cobbled together, and integration is a real headache.
“Organizations should take a fully coordinated approach when it comes to various data protection technologies,” said Bertrand. “Guaranteeing RPO and RTO is difficult when disparate solutions are partially protecting a mixed or hybrid IT infrastructure.”
Account for Scale
Trying to force-fit legacy solutions into new virtualized environments generally works poorly. It can add cost, complexity and overall risk, particularly when companies are deploying hundreds of VMs or more.
“Many purpose-built solutions run into trouble when users attempt to scale beyond fifty or a hundred VMs, impacting performance,” said Robbie Wright, senior manager of product marketing at CommVault. “Choose backup that understands the scale and performance requirements of the converged virtual infrastructure to avoid unnecessarily expensive, over-designed solutions.”
Provide Granular Recovery
Backup tools in virtual environments need to deliver adequate granular recovery options to support business requirements, especially for Tier-1 applications. As organizations look to build out large private and hybrid cloud environments, the backup strategy must be able to keep pace and recover to a few hours ago, not to last night’s backup.
“Deploy a backup strategy that delivers granular restore options down to the file or object level and does so from a single pass backup operation,” said Wright. “It is also critical that organizations are creating frequent recovery points without impacting production activity.”
Support Multiple Hypervisors
Modern organizations are rarely willing or able to remain on one platform; the cost and benefit profiles of the applications they want to virtualize often force them to look at more than one. A backup strategy, therefore, should support cross-platform data protection, such as VMware and Microsoft Hyper-V. This helps reduce both cost and risk by eliminating the need for multiple point solutions that require complex integration projects.
“This approach enables users to store, relate, classify and search for all data across the enterprise,” said Wright.
Inventory and Optimize
Christopher L. Poelker, Vice President of Enterprise Solutions at FalconStor Software, believes a more organized approach to backup is helpful. He said it is important for an IT department to take an inventory of the existing backup environment, the business's RTO and RPO objectives for its applications, and the biggest issues it faces from a backup and recovery perspective. This gives IT the big picture so it can optimize backup for actual needs and the current environment.
“In an optimized IT model that takes advantage of storage virtualization, backup and recovery is typically the largest beneficiary from a cost savings perspective,” said Poelker. “Cloud-based services should be reviewed to see if there is financial and operational benefit to moving the backup process to a service provider.”