Top Ten Storage Trends of 2013

What are the biggest trends in storage that emerged in 2013? We asked a collection of storage professionals and here are their thoughts about the rapidly evolving storage market.

Flash, Flash and More Flash

Flash is one of the more obvious storage trends of late. But Gary Orenstein, Executive Vice President of Marketing at Fusion-io, noted that the discussion of flash in the enterprise has evolved from “When should I add flash?” to “How should I add flash to my data center?” Multiple server vendors now offer multiple flash deployment options, and enterprises of all sizes can boost performance while fitting flash into their current architectures.

“Many businesses are integrating flash into their IT systems using software that optimizes flash to accelerate in server, virtual, and shared applications,” said Orenstein.
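To make that pattern concrete, here is a minimal sketch of the flash-as-cache approach Orenstein alludes to: hot reads are served from a fast flash tier while misses fall through to slower shared storage. It is an illustration only, not Fusion-io's software, and the read_from_backing_store helper is a hypothetical stand-in.

```python
# Minimal sketch of flash acting as a server-side read cache in front of
# slower shared storage. Illustrative only; not any vendor's actual software.
from collections import OrderedDict


def read_from_backing_store(block_id):
    """Hypothetical slow path: fetch a block from shared disk storage."""
    return b"data-for-" + str(block_id).encode()


class FlashReadCache:
    def __init__(self, capacity_blocks=1024):
        self.capacity = capacity_blocks
        self.cache = OrderedDict()  # stands in for the flash device

    def read(self, block_id):
        if block_id in self.cache:            # cache hit: serve from flash
            self.cache.move_to_end(block_id)  # mark as recently used
            return self.cache[block_id]
        data = read_from_backing_store(block_id)  # cache miss: go to disk
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:       # evict least recently used
            self.cache.popitem(last=False)
        return data
```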

Tiering Solidifies

After a couple of years of hard campaigning and growing market acceptance, the new world of storage tiering appears firmly established: flash, then disk, then tape, in that order.

“Flash storage dominates Tier 1 IT production with traditional storage for Tier 2 and tape for deep archive,” said Philippe Nicolas, Director of Product Strategy, Scality.
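For illustration, a placement policy along those lines might look something like the sketch below, which assigns data to flash, disk or tape by access recency. The thresholds are assumptions made for the example, not any vendor's actual policy.

```python
# Sketch of an age-based tiering policy: flash for hot data, disk for warm,
# tape for deep archive. Thresholds are illustrative assumptions.
from datetime import datetime, timedelta


def choose_tier(last_access, now=None):
    """Pick a storage tier based on how recently the data was accessed."""
    now = now or datetime.utcnow()
    age = now - last_access
    if age < timedelta(days=7):
        return "flash"   # Tier 1: active production data
    if age < timedelta(days=365):
        return "disk"    # Tier 2: traditional storage
    return "tape"        # deep archive


# Example: a file untouched for two years lands on tape.
print(choose_tier(datetime.utcnow() - timedelta(days=730)))
```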

Open Software-Defined Storage

Okay, so the first couple of trends were pretty obvious, and software-defined storage is another no-brainer. But rather than restating what everyone knows, we asked a couple of sources where the software-defined movement is heading.

Ranga Rangachari, Vice President and General Manager of Storage at Red Hat, claimed that 2013 represented the maturation of open software-defined storage as a fundamental component of the software-defined datacenter. By using open source software, he said, it is possible to enable greater portability, intelligence, terabyte-scale and convergence of compute and storage.

“Coupled with volume economics by running the software on standards-based commodity hardware, it results in an agile, scalable, loosely coupled environment for cost effective management of unstructured data storage,” said Rangachari. “Whether it’s big data technologies, such as Hadoop, cloud infrastructure such as OpenStack, or open software defined storage, open source is the place to collaborate on the next generation of data and IT technologies.”

Extracting Value

Clearly, 2013 saw a focus on software-defined storage to provide a software virtualization layer that supports a range of commodity storage and servers. Software-defined storage offers a better approach to storage cost reduction than traditional silos and appliances. However, as data growth continues apace, reducing storage cost remains important, but it may no longer be enough.

“It is becoming ever clearer that solutions and technologies that focus on gaining value from data, rather than just reducing storage cost, will continue to gain traction,” said Shahbaz Ali, CEO and President of Tarmin. “Organizations will seek to turn data from a cost center into a strategic business enabler, not just to reduce storage TCO, but also to minimize corporate risk and gain increased business agility, revenue growth and competitive advantage.”

Ali expanded on that theme, explaining that reducing TCO has become table stakes in the storage market. Businesses are retaining ever-larger volumes of data over longer retention periods, driven by the need to extract further value from that data and gain competitive advantage.

Object Becomes Hot

Until now, object storage has been a somewhat esoteric technology, championed by a few companies but never really catching fire. In 2013, however, several companies offered object storage technology with a standard file interface. Suddenly, object storage is far easier for anyone with a lot of storage to use.

“The biggest trend we have witnessed in 2013 is the move from experimentation to real production deployment of object storage,” said Jerome Lecat, CEO, Scality. “The use of object storage technology is going to grow like an ever-expanding mushroom ring.”
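Part of the appeal is the simplicity of the access model: objects are written and read over HTTP by bucket and key rather than by file path. The rough sketch below shows that general RESTful pattern against a hypothetical endpoint; it is not Scality's actual API.

```python
# Sketch of the generic RESTful object storage access pattern: PUT to write,
# GET to read, addressed by bucket and key. Endpoint and bucket are
# hypothetical placeholders, not any particular vendor's API.
import requests

ENDPOINT = "https://objectstore.example.com"   # hypothetical object store
BUCKET = "backups"


def put_object(key, data):
    resp = requests.put(f"{ENDPOINT}/{BUCKET}/{key}", data=data)
    resp.raise_for_status()


def get_object(key):
    resp = requests.get(f"{ENDPOINT}/{BUCKET}/{key}")
    resp.raise_for_status()
    return resp.content


# Example usage against the placeholder endpoint.
put_object("2013/archive-001.tar", b"example payload")
print(get_object("2013/archive-001.tar"))
```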

Open Source for Big Data Storage

Another fan of open source is Molly Rector, Executive Vice President of Product Management and Worldwide Marketing at Spectra Logic. She has noticed a shift: enterprises are looking beyond traditional storage methods and vendors, and starting to evaluate the benefits of open source options, RESTful/object storage, and open long-term storage data formats such as LTFS. The driver, she believes, is the growing awareness of big data.

“2013 was really the coming out party for what the industry is labeling Big Data,” said Rector. “Big Data means different things to many people in the industry, but it goes beyond just data analytics.”

Convergence

Those working in storage might feel a little squeezed these days. The traditional dividing lines between storage and networking and even storage and compute are getting blurry. This trend towards convergence picked up pace in 2013. “The ability to leverage the same infrastructure to deliver a converged compute and storage infrastructure is gaining customers significant cost savings while also increasing performance of data intensive applications,” said Red Hat’s Rangachari. “Convergence is transforming the role of storage and bringing applications closer to data.”

Recovery, Not Backup

Backup is largely taken for granted these days. With backup windows no longer being the issue they once were, attention has moved on to recovery. And that has seen the arrival of cloud-based backup services, which fall under the umbrella of Recovery as a Service (RaaS).

Gartner believes this market is blossoming. The firm’s John Morency estimates that RaaS will be worth $564 million in 2013, with a projected growth rate of 21% over the next three years.
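For a rough sense of scale: if that 21% is read as a compound annual growth rate applied to the $564 million base (an assumption; Gartner's exact methodology isn't spelled out here), the market would approach $1 billion within three years.

```python
# Quick check of what a 21% growth rate implies, assuming it is a compound
# annual rate applied to the $564M 2013 base (an assumption on our part).
market = 564  # $ millions, 2013 estimate
rate = 0.21

for year in range(2014, 2017):
    market *= 1 + rate
    print(f"{year}: ${market:,.0f}M")
# 2014: $682M, 2015: $826M, 2016: $999M -- roughly $1B by 2016
```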

Performance Records

The year 2013 may well go down as the year in which the most storage performance records were broken. Vendor after vendor announced the latest and greatest milestone. Most recently, Permabit announced that it broke the million IOPS inline deduplication barrier with its Albireo Virtual Data Optimizer (VDO) software.

“Today’s enterprises deploy high-end storage systems to increase the performance of mission critical and I/O intensive applications such as databases, virtual servers and virtual desktops,” said Tom Cook, CEO of Permabit. “While deduplication greatly reduces storage costs, the challenge up until now has been developing a deduplication algorithm that could run at the high-end of enterprise speeds.”  
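In broad strokes, inline deduplication fingerprints each incoming block and stores only blocks that have not been seen before; the hard part Cook describes is keeping that fingerprint lookup fast enough for million-IOPS workloads. The sketch below shows the general technique only, not Permabit's VDO algorithm.

```python
# Sketch of the general inline-deduplication technique: hash each incoming
# block and store only new fingerprints. Illustrative only; real systems must
# keep the fingerprint index fast enough for million-IOPS workloads.
import hashlib

block_store = {}    # fingerprint -> block data (stored once)
index = []          # logical write order, recorded as fingerprints


def write_block(data):
    fingerprint = hashlib.sha256(data).hexdigest()
    if fingerprint not in block_store:   # new data: store it once
        block_store[fingerprint] = data
    index.append(fingerprint)            # duplicates cost only an index entry


for block in [b"alpha", b"beta", b"alpha", b"alpha"]:
    write_block(block)

print(f"logical blocks: {len(index)}, physical blocks: {len(block_store)}")
# -> logical blocks: 4, physical blocks: 2
```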

Private Cloud

EMC was one of the big champions of the private cloud, and the concept appears to be taking hold. According to a report by Technology Business Research (TBR), private cloud adoption has surged. In fact, the research firm believes the market will grow to a whopping $69 billion in revenue by 2018.

TBR surveyed 650 cloud end users around the world and found that private cloud adoption was initially spurred by the need to fill gaps left by various public cloud initiatives. As organizations have become more comfortable building their own internal clouds, however, workloads run on private clouds have increased by 29%.

“Private cloud has truly come into its own as a delivery mechanism that customers understand and are using to achieve the benefits of cloud where public options are either not available or viable,” said Allan Krans, an analyst at TBR.

Drew Robb
Drew Robb is a contributing writer for Datamation, Enterprise Storage Forum, eSecurity Planet, Channel Insider, and eWeek. He has been reporting on all areas of IT for more than 25 years. He has a degree from the University of Strathclyde in the UK and lives in the Tampa Bay area of Florida.
