Storage Virtualization Has Room to Grow


SAN JOSE, Calif. — Server virtualization, the consolidation of multiple physical servers onto one powerful machine, has been growing like gangbusters in major enterprises. So it stands to reason that storage virtualization could follow the same pattern, given the exploding universe of user-generated digital content.

Storage virtualization was the topic of several session tracks here at the IDC Directions conference, the bi-coastal industry briefing hosted by the research firm. The technology is usually deployed in a storage area network (SAN) to pool a number of storage devices so they can be managed as a single device. Storage virtualization abstracts the physical location of the data, making it appear to be anywhere on the network.

Data appears in a logical storage space and the virtualization system handles the process of mapping it to the actual physical location. This is very different from server virtualization, which is done primarily to consolidate a number of servers on one machine to improve utilization.
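In practice, that mapping layer works much like an address translation table. The sketch below is purely illustrative of the concept described above; the class and device names are hypothetical, not any vendor's actual API.

```python
# Illustrative sketch of logical-to-physical mapping in storage virtualization.
# All names (PhysicalExtent, VirtualVolume, "array-A", etc.) are hypothetical.

class PhysicalExtent:
    """A slice of capacity on a real device somewhere in the SAN."""
    def __init__(self, device_id, offset, length):
        self.device_id = device_id
        self.offset = offset
        self.length = length

class VirtualVolume:
    """Presents one contiguous logical address space to the host,
    regardless of where the data physically lives."""
    def __init__(self, extents):
        self.extents = extents  # ordered list of PhysicalExtent

    def resolve(self, logical_offset):
        """Map a logical offset to (device, physical offset)."""
        remaining = logical_offset
        for ext in self.extents:
            if remaining < ext.length:
                return ext.device_id, ext.offset + remaining
            remaining -= ext.length
        raise ValueError("offset beyond end of volume")

# A 30 GB virtual volume stitched together from extents on three arrays.
GB = 2**30
vol = VirtualVolume([
    PhysicalExtent("array-A", offset=0,      length=10 * GB),
    PhysicalExtent("array-B", offset=5 * GB, length=10 * GB),
    PhysicalExtent("array-C", offset=0,      length=10 * GB),
])
print(vol.resolve(12 * GB))  # -> ('array-B', 7 GB into that array)
```

The host only ever sees the single logical address space; moving an extent to a different array just means updating the table, which is what makes features like non-disruptive data migration possible.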

Storage and server consolidation are often done together at a company. IDC’s research found that 85 percent of firms enabling storage virtualization also adopt server virtualization, according to Rick Villars, vice president of storage system research for the firm.

“But they are done in silos, so no one can use the two in combination due to limitations of the hardware and the APIs,” he added.

Storage virtualization is often geared toward dynamic storage, disaster recovery and planned downtime, all designed to improve uptime and data availability. But, Villars added, storage virtualization isn't just for large enterprises anymore. Increasingly, it's becoming a tool for individual users and consumer-facing applications.

“If you’re Flickr, people expect you to protect their photos forever. If they log in two years from now and they aren’t there, they’re going to be upset. Who’s protecting that?” said Villars.

This is bringing about the rise of what he called “content depots,” like Flickr, YouTube and Google. Villars predicted that sites like those will consume 25 times as much storage space by 2010 as they do now — great news for the likes of EMC.

“We have some companies installing 100 terabytes a day, while others are tearing out 100 terabytes a week. A petabyte is now the entry point,” he said.

Article courtesy of InternetNews.com
