5 Top Data Recovery Trends 

Backup is vital. But recovery has become even more so. 

Companies no longer place the faith they once did in backup schedules that were supposed to guarantee their data would be available whenever they needed it.

See below for some of the top trends in the data recovery market:  

1. Raised profile of recovery 

Databases and enterprise resource planning (ERP) systems, such as SQL Server, SAP HANA, Oracle, and MaxDB, are at the heart of IT operations for many companies. 

For these essential systems, simply backing up data is not sufficient to meet the recovery time objectives (RTOs) and recovery point objectives (RPOs) they demand. 

“IT teams are now using clustering solutions that fail over across geographically separated nodes or cloud regions and availability zones to ensure low recovery time and recovery point objectives can be met,” said Ian Allton, solutions architect, SIOS Technology. 
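The distinction between the two objectives is worth making concrete. A minimal sketch, with invented function names and example timestamps (not from any vendor's product): RPO bounds how much recent data you can afford to lose, while RTO bounds how long the service can stay down.

```python
from datetime import datetime, timedelta

# Hypothetical illustration of RTO/RPO; names and figures are invented,
# not taken from any vendor's product.

def meets_objectives(last_backup: datetime, outage_start: datetime,
                     service_restored: datetime,
                     rpo: timedelta, rto: timedelta) -> bool:
    """RPO bounds acceptable data loss (time since the last good copy);
    RTO bounds acceptable downtime (outage start to restored service)."""
    data_loss = outage_start - last_backup      # work that may be lost
    downtime = service_restored - outage_start  # time users were offline
    return data_loss <= rpo and downtime <= rto

# A nightly backup cannot satisfy a 15-minute RPO, even with fast restore:
last = datetime(2022, 6, 1, 0, 0)
outage = datetime(2022, 6, 1, 9, 0)
restored = datetime(2022, 6, 1, 9, 10)
print(meets_objectives(last, outage, restored,
                       rpo=timedelta(minutes=15), rto=timedelta(hours=1)))
# → False: nine hours of potential data loss exceeds the 15-minute RPO
```

This is why the clustering and cross-region replication Allton describes matters: only a continuously synchronized standby can bring the "last good copy" close enough to the failure to satisfy tight RPOs.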

2. Repatriating cloud storage 

The rush to the cloud was helped along by many special offers, discounts, and free packages. 

And as the promise of low-cost storage has run aground on the shoals of complexity, some are now moving data from the cloud back in-house to be more in control of its recoverability. 

“As the first-year free programs end, IT professionals counting on the cloud to store all their backup and archive data are now looking at moving data back on-premises,” said George Crump, chief product strategist, StorONE. 

“Storing data in the cloud, especially static data, is expensive.” 

Crump advocates for highly dense, cost-effective storage solutions that can also provide long-term data resiliency. His advice is that IT needs to look for solutions that can support 20 TB+ hard disk drives (HDDs) without suffering through week-long drive recovery efforts, while also providing resilient ransomware storage. 
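The back-of-envelope arithmetic behind those week-long rebuild warnings is straightforward. The throughput figures below are illustrative assumptions, not measurements of any specific drive or array:

```python
# Rough estimate of HDD rebuild time; throughput figures are assumed
# for illustration, not measured from any particular product.

def rebuild_hours(capacity_tb: float, throughput_mb_s: float) -> float:
    """Hours needed to read or write a full drive at a sustained rate."""
    bytes_total = capacity_tb * 1e12
    seconds = bytes_total / (throughput_mb_s * 1e6)
    return seconds / 3600

# Best case: dedicated sequential rebuild at 200 MB/s
print(round(rebuild_hours(20, 200), 1))  # → 27.8 hours
# Rebuild throttled to 30 MB/s on an array still serving production I/O
print(round(rebuild_hours(20, 30), 1))   # → 185.2 hours, well over a week
```

Even under ideal conditions, a 20 TB drive takes more than a day to rebuild; once the rebuild competes with production workloads, the week-long figure Crump warns about becomes plausible.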

3. Bigger big data

The amount of data we produce continues to increase. 

Big data, now measured in hundreds of zettabytes globally, can no longer be backed up with the approaches that have been used for decades. 

The problem is so difficult that enterprises will often forego proper backup when large amounts of data are involved. When that happens, there is no recovery in the event of loss or corruption.

“When the scales reach billions of files or more and petabytes of data or more, backup no longer works,” said Jason Lohrey, CEO, Arcitecta. 

“Data resilience must, and will, become an integral part of the file system and data fabrics.” 

4. Eggs in many baskets 

There has been a tendency to put all backup data on tape, or all of it in the cloud, or all of it on disk. 

But there is a growing need for diversity in backup and recovery media and locations. 

Indeed, the 3-2-1 rule has long advocated this — make three copies of your data, store them on two different types of storage media, and keep one copy off-site. Sensible advice. Yet, many have ignored it. 

“Long-term backup will move from entirely tape (A and B copies) to either a combination of tape (A copy) and cloud (B copy) or entirely cloud (A copy only), using the cheapest possible storage,” said Lohrey with Arcitecta. 

“Those concerned with diversity will use a mix of storage technologies and vendors to avoid having all eggs in the one basket. That may mean having the A copy in one cloud and the B copy in another. If one fails, then recovery can occur from the other.”   

5. Metadata grows in importance 

Metadata used to have a relatively minor role in file management and storage. 

But as time goes on, developers are finding more ways to harness metadata to improve performance, enhance searchability and analytics, and provide more sophisticated features for storage, backup, and recovery. 

More recently, metadata has been seen as a way to improve resiliency and drive faster recovery. Modern systems contain far more metadata details than ever. These can be used to cross-reference data, analyze it, search it, and more. 

“There will be a focus on the use of metadata to drive the placement of data copies in order to meet data resilience and recovery objectives,” said Lohrey with Arcitecta.

“Metadata will be used to automatically minimize the overall costs for a given recovery objective.” 
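The cost-minimization Lohrey describes can be sketched as a simple tier-selection problem: choose the cheapest storage class whose restore speed still meets the recovery objective. The tier names, prices, and restore rates below are invented for illustration:

```python
# Hypothetical sketch of metadata-driven copy placement: pick the
# cheapest tier whose restore time still meets the recovery objective.
# Tier names, costs, and speeds are invented, not real cloud pricing.

def cheapest_tier(tiers, dataset_tb, rto_hours):
    """Return the lowest-cost tier able to restore within the RTO."""
    candidates = [t for t in tiers
                  if dataset_tb / t["restore_tb_per_hour"] <= rto_hours]
    return min(candidates, key=lambda t: t["cost_per_tb_month"])["name"]

tiers = [
    {"name": "local-disk",    "cost_per_tb_month": 20.0, "restore_tb_per_hour": 10.0},
    {"name": "cloud-warm",    "cost_per_tb_month": 10.0, "restore_tb_per_hour": 2.0},
    {"name": "cloud-archive", "cost_per_tb_month": 1.0,  "restore_tb_per_hour": 0.1},
]

# 5 TB dataset with a 4-hour RTO: archive restores too slowly (50 hours),
# warm restores in 2.5 hours, so the cheapest qualifying tier wins.
print(cheapest_tier(tiers, dataset_tb=5, rto_hours=4))  # → cloud-warm
```

In a real system the per-dataset metadata — size, criticality, required RTO — would drive this decision automatically for every copy, which is precisely the automation Lohrey predicts.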

Drew Robb
Drew Robb has been a full-time professional writer and editor for more than twenty years. He currently works freelance for a number of IT publications, including eSecurity Planet and CIO Insight. He is also the editor-in-chief of an international engineering magazine.
