10 Data Storage Tips: How to Improve Your Storage


Better data storage means different things to different people. For some it is all about speed; for others, cost is the primary factor. For many it is about coping with soaring data volumes, while for others, simplicity and ease of installation and use are the top-of-mind elements.

Whatever your opinion of what better data storage is, here are a few tips on how to improve storage in the coming year.

Put Flash in the Server

Dan McConnell, Executive Director for Dell Storage Product Management, advises enterprises to add far more flash than they have been. His view is that this will have a major performance impact, particularly if flash is deployed all the way down in the server, close to the storage consumer.

“The relationship between servers and storage will continue to evolve in the coming year as customers demand the fastest possible access to their most critical applications,” said McConnell. “Organizations are embracing flash technology embedded in servers to increase response time for consumers, specifically those in industries such as healthcare, finance and retail, where instant transactions can define the customer experience.” 

Strategic Flash

But all flash everywhere all the time might not be a wise strategy. George Teixeira, President and CEO, DataCore, pointed out that flash is not cost effective for all workloads and still makes up a very small fraction of the installed storage base overall.

“The industry would have us believe that users will shift 100% to all flash, but it is not practical due to the costs involved, and the large installed base of storage that must be addressed,” said Teixeira. “On the other side of the spectrum are low-cost SATA disk drives that continue to advance and use new technologies like helium to support huge capacities, up to 10 TB per drive, but they are not highly performant. Software should be used to unify the new world of flash with the existing and still evolving world of disks.”

Use Tape for High-Volume, Long-Term Storage

Just as the mainframe and data center continue to be alive and well despite pronouncements to the contrary, so too has tape remained very much a part of the storage landscape. Peter Faulhaber, President of Fujifilm Recording Media, said that demand for tape storage is higher than ever. That demand is being fueled by unrelenting data growth, tape’s technological advancements, favorable economics, and regulatory requirements to maintain access to data long term. 

“The role tape serves in today’s modern data center is expanding beyond its historical role in data backup to one that includes long-term archiving of enormous quantities of stored data,” said Faulhaber.

More Tiering

The adoption of storage tiering seems to go in waves. Data volumes become unmanageable so tiering becomes popular to reduce the load on high-performance systems. Then processor speeds, better memory and storage performance catch up, allowing people to get sloppy on tiering. For a while they think they can get along fine with one big high-performance pile of storage. But reality is hitting once again, said Faulhaber. Inevitably, data volumes soar, driving the implementation of an increasing number of tiers of storage. 

“There will be an increase in Tier 0 with a tidal wave of flash adoption for the fastest form of storage as well as a multi-tier approach to long-term data, with the rapid adoption of public cloud and an anticipated swift increase in private cloud creation,” said Faulhaber. “Combinations of flash, disk and tape are being used in both public and private clouds to meet custom requirements.”
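
As a rough illustration of the multi-tier idea, the sketch below assigns files to a flash, disk, or archive tier based on how recently they were accessed. The tier names and age thresholds are hypothetical, not drawn from any product mentioned here.

```python
import os
import time

# Hypothetical age thresholds (in days) for demoting data down the tiers.
FLASH_MAX_AGE_DAYS = 7     # hot data stays on flash (Tier 0)
DISK_MAX_AGE_DAYS = 90     # warm data lives on disk; older data is archive material


def pick_tier(path, now=None):
    """Suggest a tier for a file based on its last access time."""
    now = now or time.time()
    age_days = (now - os.stat(path).st_atime) / 86400
    if age_days <= FLASH_MAX_AGE_DAYS:
        return "flash"
    if age_days <= DISK_MAX_AGE_DAYS:
        return "disk"
    return "archive"


if __name__ == "__main__":
    # Print a suggested placement for every file in the current directory.
    for name in os.listdir("."):
        if os.path.isfile(name):
            print(f"{name}: {pick_tier(name)}")
```

Real tiering software weighs far more than access age (I/O patterns, policy, cost per GB), but the placement decision follows the same basic shape.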

Servers, Not Storage Arrays

Storage arrays have been a key element in the storage business for many years because they hold a lot of capacity. Now, as servers have become more powerful and can hold quite a bit of storage themselves, they are increasingly being used as a Server SAN or a hyper-converged Virtual SAN.

“This has resulted in a portion of the classic storage array market share being eaten up by servers that can handle the storage function,” said Teixeira of DataCore.

For example, Dell PowerEdge and Fujitsu PRIMERGY high-end servers are being combined with software-defined storage to create systems where all of the storage stack can run on the server. Virtual SAN software allows relatively easy setup of these servers.

Skip Cloud Building

Private clouds were initially advocated, because of security concerns and the limited enterprise scalability of public clouds, as a way for companies to retain control over their data and manage their storage across the enterprise. But public clouds have grown steadily in sophistication and now encompass many more features, to the point that anyone thinking about building their own private cloud should seriously question the value proposition, said Bob Muglia, CEO of cloud data warehousing company Snowflake Computing.

He added that the window of opportunity for private cloud is closing fast and that 2015 will be the year of the public cloud. Instead of the hundreds of private clouds many had predicted by now, few organizations have realized the vision of enterprise cloud storage with a high level of automation.

“There are almost no production-level private clouds within enterprises today, defined as having no person involved in allocating system resources,” said Muglia. “Amazon, Google and Microsoft are simply too far ahead in maturity, cost, and functionality. This means the vast majority of companies will aggressively move to public clouds.”

Use OpenStack

But Tintri CTO Kieran Harty believes that many will press on with private clouds, including service providers. His advice to them is to take advantage of OpenStack’s storage components, Cinder and Swift, when building their own block and object storage in a private cloud.

“Service providers will quickly grow to service the private clouds of organizations that don’t want to run their own infrastructure,” said Harty. “Cloud service providers will turn to storage that removes guesswork, allowing for precise pricing, quality of service guarantees and happier customers.”
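
As a minimal sketch of what using Swift looks like in practice, the snippet below stores and lists a backup object with the python-swiftclient library. The Keystone endpoint, credentials, container and file names are placeholders; a real private cloud deployment would supply its own authentication details.

```python
from swiftclient.client import Connection

# Placeholder credentials and endpoint -- substitute values for your own
# OpenStack deployment (typically issued by Keystone).
conn = Connection(
    authurl="https://keystone.example.com:5000/v3",
    user="demo",
    key="secret",
    auth_version="3",
    os_options={
        "project_name": "demo",
        "user_domain_name": "Default",
        "project_domain_name": "Default",
    },
)

# Create a container and upload a backup object into Swift.
conn.put_container("backups")
with open("db-backup.tar.gz", "rb") as f:
    conn.put_object("backups", "db-backup.tar.gz", contents=f)

# List what is stored in the container.
headers, objects = conn.get_container("backups")
for obj in objects:
    print(obj["name"], obj["bytes"])
```

Block storage follows the same pattern through Cinder, where volumes are created and attached to instances rather than accessed as objects.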

More Commodity Storage

Storiant is an example of a service provider stepping in to solve the private cloud storage problem. It harnesses commodity storage to ease the internal pressure on those trying to erect their own private clouds.

“Enterprises want to avoid paying for storage performance they don’t need,” said John Hogan, Vice President of Engineering and Product Management at Storiant. “The market is embracing open solutions involving commodity hardware to a greater extent than ever before.”

DR as a Service

Disaster recovery (DR) has never been cheap. Building a duplicate set of IT infrastructure has meant that many organizations have had to cut corners when it came to DR. Kemal Balioglu, VP of Products, Quorum, advises those struggling with the price tag of end-to-end DR to outsource that service to the cloud as a way to cut costs. DR as a Service (DRaaS) providers can leverage economies of scale by using the same infrastructure among multiple customers who in turn only pay for the service they use.

“Because DRaaS is easy to deploy and manage when compared with traditional non-cloud disaster recovery solutions, we can expect to see a rapid increase in adoption, particularly in the mid-market,” said Balioglu.

Back Up the Backup

With so much data being sent to the cloud, Mike Karp, an analyst with Ptak Associates, strongly advises those doing so to back up their cloud backups.

“Keep copies of your most recent backups on premise,” said Karp. “Having the most recent backup to hand means that recoveries of recent data are much faster as they will be recovering over their LANs rather than from the cloud.”
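
One simple way to follow that advice is to keep the newest backup on local disk even as copies go to the cloud. The sketch below uploads a backup to an S3-compatible service with boto3 while retaining an on-premises copy; the bucket name and paths are hypothetical.

```python
import shutil
from pathlib import Path

import boto3  # assumes an S3-compatible cloud backup target

LOCAL_CACHE = Path("/backups/latest")   # hypothetical on-premises cache location
BUCKET = "example-cloud-backups"        # hypothetical bucket name


def back_up(backup_file: str) -> None:
    """Ship the backup to the cloud, but keep the newest copy on premises."""
    src = Path(backup_file)

    # 1. Send the backup off-site.
    s3 = boto3.client("s3")
    s3.upload_file(str(src), BUCKET, src.name)

    # 2. Keep only the most recent backup locally for fast LAN restores.
    LOCAL_CACHE.mkdir(parents=True, exist_ok=True)
    for old in LOCAL_CACHE.glob("*"):
        old.unlink()
    shutil.copy2(src, LOCAL_CACHE / src.name)


if __name__ == "__main__":
    back_up("/backups/staging/db-backup.tar.gz")
```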


Drew Robb
Drew Robb is a contributing writer for Datamation, Enterprise Storage Forum, eSecurity Planet, Channel Insider, and eWeek. He has been reporting on all areas of IT for more than 25 years. He has a degree from the University of Strathclyde in the UK and lives in the Tampa Bay area of Florida.
