Measuring Storage Performance


In the first article in this series, we covered benchmarking in general. Now it’s time to dig into a specific set of benchmarks that are frequently used in storage.

The Storage Performance Council (SPC) has had a benchmark out for some time called the SPC-1. It has been used by dozens of vendors to highlight their products, as well as by users to compare the performance of one system against another.

SPC recently added the SPC-2 benchmark to its battery of tests. And it is coming out with further tests this year — SPC-1C and SPC-2C.

Vendors are, of course, big users of such tools. A casual look at the SPC site shows a few vendor names coming up repeatedly, such as Dell, Fujitsu Siemens, Hewlett-Packard, Intel, IBM and Sun Microsystems. These major players are repeated users of SPC-1.

3PARdata, however, only conducted one test in 2004 using SPC-1. So was it disappointed with the results? Geoff Hough, director of product marketing at 3PAR, reports that the company was happy with its scores and intends to post an SPC-2 result in the near future.

“For customers, what makes a benchmark like the SPC valuable is its openness, its full disclosure and its fidelity to real-life workloads,” says Hough. “Customers are empowered to compare results openly and to evaluate the relevancy of results for their own environment.”

End users, too, make use of SPC scores. James Yaple, IT specialist at the U.S. Department of Veterans Affairs (VA) Austin Automation Center, has been using SPC-1 and SPC-2. As well as comparing product performance, he uses these tests to set chargeback rates for his storage customers within a tiered architecture. The basic tier, for example, holds a minimum of 5TB and meets certain benchmarks for less than $10/GB per month. Upper tiers must attain higher scores and are charged back at more expensive rates.

“These benchmarks are based on SPC-1 figures of 100 I/Os per second minimum for the host on the SPC-1 benchmark and an average throughput of 10 MBps on SPC-2,” says Yaple. “We use the workload characteristics to determine which tier.”

In the mid-tier, the VA is targeting a cost of $20 per GB for 50 TB of storage that meets 250 IOPS on SPC-1 and 25 MBps on SPC-2. For its premium tier, the VA intends to keep the costs under $125 per GB per month for 50 TB of storage that offers 500 IOPS on SPC-1 and 25 MBps on SPC-2.
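The tier-assignment logic Yaple describes — mapping measured SPC-1 IOPS and SPC-2 throughput to a chargeback tier — can be sketched in a few lines. The thresholds below are taken from the VA figures quoted above, but the function and tier names are hypothetical illustrations, not part of any SPC specification.

```python
def select_tier(iops: float, mbps: float) -> str:
    """Map measured SPC-1 IOPS and SPC-2 throughput (MBps) to a storage tier."""
    tiers = [
        # (tier name, min SPC-1 IOPS, min SPC-2 MBps) -- checked best tier first
        ("premium", 500, 25),
        ("mid",     250, 25),
        ("basic",   100, 10),
    ]
    for name, min_iops, min_mbps in tiers:
        if iops >= min_iops and mbps >= min_mbps:
            return name
    return "below basic tier thresholds"

print(select_tier(300, 25))  # prints "mid"
print(select_tier(120, 12))  # prints "basic"
```

In a real chargeback scheme each tier would also carry its rate (e.g. under $10/GB per month for the basic tier), but the point is simply that published, audited benchmark numbers give a common yardstick for classifying workloads.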

Yaple ran the Apple Xserve RAID box through its paces on these tests. It contained 14 x 500 GB ATA drives and linked to a Dell PT 2650 server with dual Emulex 9000 HBAs. He found that the machine excelled at large I/Os, but those weren’t the types of workloads commonly in play at the VA.

Navigating the SPC Tests

SPC itself is a non-profit corporation founded to define, standardize and promote storage subsystem benchmarks as well as to disseminate objective, verifiable performance data to the computer industry and its customers. Its member roster reads like a who’s who of storage.

“SPC-1 and SPC-2 provide storage product vendors the ability to utilize industry-standard workloads for internal performance engineering measurements,” says Walter Baker, SPC administrator and auditor. “Vendors can then utilize the same industry-standard workloads to produce SPC results, which will allow them to publicly differentiate their storage products based on performance and price-performance.”

The organization currently puts out a couple of tests. SPC-1 consists of a single workload designed to demonstrate the performance of a storage subsystem while performing the typical functions of business-critical applications. Those applications are characterized by predominately random I/O operations and require both queries as well as update operations. Examples of those types of applications include OLTP, database operations and mail server implementations.

SPC-2, on the other hand, consists of three distinct workloads designed to demonstrate the performance of a storage subsystem during the execution of business-critical applications that require the large-scale sequential movement of data. Those applications are characterized predominately by large I/Os organized into one or more concurrent sequential patterns. These workloads include the processing of large files, large database queries and video on demand.
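The distinction between the two benchmarks comes down to access pattern: SPC-1 models many small I/Os at random offsets, while SPC-2 models large, concurrent sequential streams. The toy generators below illustrate that difference only — they are not the SPC toolkit, and all names and sizes are illustrative assumptions.

```python
import random

def random_workload(n_ios: int, capacity: int, io_size: int = 4096):
    """SPC-1-style pattern: small I/Os at random offsets (OLTP, mail servers)."""
    return [(random.randrange(0, capacity - io_size), io_size)
            for _ in range(n_ios)]

def sequential_workload(n_ios: int, start: int = 0, io_size: int = 1 << 20):
    """SPC-2-style pattern: one large sequential stream (large files, video)."""
    return [(start + i * io_size, io_size) for i in range(n_ios)]

# Each list holds (offset, length) pairs a driver might issue.
print(sequential_workload(3))
# prints [(0, 1048576), (1048576, 1048576), (2097152, 1048576)]
```

A disk array that excels at the second pattern (as Yaple found with the Xserve RAID) may still fare poorly on the first, which is why the SPC publishes the two workloads as separate benchmarks.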

“The SPC-1 Toolkit has been purchased by some number of non-members for internal measurements, and we anticipate that to also be the case when the SPC-2 Toolkit is released for non-member purchase,” says Baker. “Those internal measurements should not be confused nor compared with audited SPC measurements submitted to the SPC for release as SPC Results, which appear on the SPC Web site.”

Such audited results include a publicly available full disclosure report (FDR). In addition, they have successfully completed an SPC Peer Review, during which issues of compliance may be raised concerning the result. Thus, there are different types of SPC results that end users should be aware of: in addition to internal tests, which are not verified, there are several others issued by SPC.

Baker notes that there are actually three related entities that are referenced in SPC results: Tested Storage Product (TSP), Tested Storage Configuration (TSC), and Priced Storage Configuration (PSC). The relationship and differences between the three are best illustrated using an example such as IBM’s SAN Volume Controller.

The SAN Volume Controller would be the Tested Storage Product (TSP) as designated by the SPC benchmark specification. It is the focal point of the SPC result and is what will be publicized by IBM.

But the SAN Volume Controller, primarily a storage software product, is not a complete storage configuration in practice and as defined by the SPC specifications. The addition of required storage devices, HBAs, cables, and so on to the SAN Volume Controller creates a complete storage configuration, the SPC Tested Storage Configuration (TSC). That is the configuration used to produce audited SPC measurements submitted to the SPC for release as an SPC result.

In most cases, the TSC is also the configuration that is priced in the full disclosure report and in those cases also designated as the Priced Storage Configuration (PSC). The SPC specifications allow some differences between the TSC and the PSC.

“For example, a 32-port switch may be the only unit available for the TSC in the lab, but only 16 ports were utilized for the benchmark measurements,” says Baker. “The test sponsor, therefore, would be allowed to include a 16-port switch in the PSC listing rather than the 32-port switch as long as there would not be a performance advantage resulting from the substitution.”

More to Come

While SPC-1 is well established and SPC-2 is slowly gaining a following, further tests are planned. We’ll soon see the SPC-1C and SPC-2C specifications, with the initial set of results released at the same time, says Baker.

As opposed to system benchmarks, these take testing down to the level of individual components such as disk drives, HBAs and logical volume managers. SPC-1C and SPC-2C correspond to the SPC-1 and SPC-2 workloads, respectively. While SPC-1 and SPC-2 results demonstrate disk drive performance in the context of large, complex storage configurations, there is currently no industry-standard benchmark specifically focused on the performance of individual disk drives.

In addition, Baker notes that there are discussions about introducing other SPC benchmarks that would measure the performance impact of remote copy/replication, run multiple instances of the current SPC benchmarks to measure storage consolidation performance, and cover file system and media/video workloads, as well as an investigation into the appropriateness of including power-related information with the current SPC benchmarks.


Drew Robb

Drew Robb is a contributing writer for Datamation, Enterprise Storage Forum, eSecurity Planet, Channel Insider, and eWeek. He has been reporting on all areas of IT for more than 25 years. He has a degree from the University of Strathclyde in the UK, and lives in the Tampa Bay area of Florida.
