IBM Tames Big Data by Blending All-Flash Storage and Spectrum Scale Software


IBM this week unveiled a new bundled storage and software solution aimed at accelerating big data workloads, helping its enterprise customers glean potentially business-boosting insights with greater speed and more efficient use of their storage capacity.

The Hortonworks Data Platform-certified solution includes the company’s all-flash Elastic Storage Server (ESS) and its Spectrum Scale storage services and management software. Combined, IBM claims, they deliver data throughput rates 60 percent faster than earlier products. The company bases that estimate on the 40GB-per-second maximum throughput of the new ESS GS6S array versus the ESS GS6’s 25GB-per-second limit.
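The 60 percent figure follows directly from those two published numbers. A quick back-of-the-envelope check (plain Python, not any IBM tooling; the throughput values are the ones cited above):

```python
# Verify the speedup claim from IBM's published maximum-throughput figures.
old_gbps = 25.0  # ESS GS6 maximum throughput, per IBM
new_gbps = 40.0  # new ESS GS6S maximum throughput, per IBM

# Percentage improvement of the new array over the old one.
speedup_pct = (new_gbps - old_gbps) / old_gbps * 100
print(f"{speedup_pct:.0f} percent faster")  # prints "60 percent faster"
```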

On the storage services front, IBM’s Spectrum Scale software provides consolidated access to an organization’s various storage pools, enabling it to run analytics on the Hadoop and Spark big data processing platforms. It unifies object, file and Hadoop Distributed File System (HDFS) data storage and can scale to exabytes of storage.

The combined solution also helps customers get the most out of their expensive flash storage investments. According to IBM, Elastic Storage Server with Spectrum Scale can reduce storage requirements by more than 55 percent.

Big data analytics is fast becoming a priority for CIOs who are tasked with helping to set a digital transformation strategy for their organizations.

IDC predicts that increasing demand for big data analytics tools will push the market to $150.8 billion this year before reaching a lofty $210 billion in 2020. Last year, IDGE’s Data & Analytics Survey found that a majority of executives (78 percent) expect big data collection and analysis to fundamentally change the way they conduct business in the next one to three years. Another recent survey, from MIT Sloan Management Review and SAS, found that 57 percent of enterprises said analytics was helping them gain a competitive advantage.

IBM’s customer base is on the same page.

“We’re seeing an increasing appetite on the part of clients to unlock value within their own business data where competitive advantage can be found but too often they face challenges that inhibit ready access to this information,” said Ed Walsh, general manager of IBM Storage and Software Defined Infrastructure, in a statement. “IBM ESS with Spectrum Scale has been developed to make data available by reducing obstacles that prevent users from quickly accessing and processing the growing oceans of differing data types stored within their disparate, isolated storage systems.”

Pedro Hernandez is a contributing editor at Enterprise Storage Forum. Follow him on Twitter @ecoINSITE.

