Those of you who read my column regularly know I look at things a little differently than some of the big-name industry analysts, such as Gartner or IDC, and try to look at the reasons behind the predictions. Most of the articles I read from these two groups and others talk about budgets in terms of how much total hardware will be sold, and much of that is based on the economy. I see the budgeting problem much differently. In my view, the problem is that you need to buy terabytes or petabytes of storage capacity – tens of petabytes if you are Google – and I believe there are significant technology changes coming for disk storage that will, in turn, impact your budgets. The way I view the problem, you will need to buy a defined amount of storage, and that has a cost, yet too many bean counters, accountants, university researchers and big industry analysts just draw straight lines based on recent history of density and cost. They assume that they have solved the problem and know what the cost will be. The last time I checked, technology growth, for the most part, does not happen on a straight line. When a new disk technology is introduced, its cost per GB is flat compared to the old technology's cost per GB. Then density rises quickly and cost drops, and the line flattens again at the end of the technology lifecycle.
Here is a history of Seagate Enterprise disk drives:
| Year | SCSI/FC/SAS disk capacity (GB) | Drive height | Size (inches) | Est. max transfer rate (MB/sec) | Avg. seek (msec) | Latency (msec) | RPM |
|---|---|---|---|---|---|---|---|
If you graph this information over time, you will get some straight lines for density increases. Over longer periods of time, however, density increases are not linear and the slope of the line changes – cost per GB drops as density increases. If you go back in history and look at this chart in terms of technology, you will find that when a new recording technology is introduced, the rate of density increase goes up, and as that technology reaches its limit, density increases slow until a new technology is introduced. We are at that point right now. The current state-of-the-art disk technology uses perpendicular recording, and we reached the limit of linear increases back in 2008. The problem is that with the major economic downturn there was little to no investment in future technology, and now we have to wait until 2013 or 2014 for any major changes, according to this report.
The two technologies mentioned in the article – heat-assisted magnetic recording (HAMR) and bit-patterned media (BPM) – both offer the potential for significant density increases, which will reduce the overall cost of the storage infrastructure.
These changes will impact both enterprise SAS drives and enterprise and consumer SATA drives. The underlying physical technology and design used in both drive types are the same. Look at the massive growth in SATA density from 2003 – 2008 and the much slower density growth since 2008. How does this impact planning budgets for storage?
Budget Planning and Density
Disk drive purchases are almost always based on how much storage you want, not bandwidth, as storage volume is the driving factor. In most budgetary planning processes I have seen, the cost of storage is based on historical costs and therefore on historical density increases. If storage density is not going to grow much over the next 3 – 4 years, it will have a dramatic impact on cost, given the storage increases that will be required in most environments. This is because:
- Storage costs are not going to drop as they have in the past. There will not be large increases in areal density for drives, and some of the fixed costs, such as SAS/SATA ASICs and disk platters, will not change, so costs will not fall as much as they have historically.
- If storage growth continues at the same rate at your site, you will need more disk drives, which require more power. In addition, you will require more disk controllers (SAS or SATA), which also require more power.
- If storage growth continues at your site at the same rate, more RAID controllers will likely be needed for performance, given that each controller has a limit on the number of disk drives it can house and address.
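The cascade above can be sketched with a back-of-envelope model: if capacity demand keeps growing while per-drive capacity stalls, drive count, controller count, and power all climb together. Every figure below is an illustrative assumption for the sketch, not vendor data or a number from this article.

```python
import math

# Illustrative assumptions, not measured figures:
DRIVE_CAPACITY_TB = 2.0      # per-drive capacity, flat until new recording tech ships
DRIVES_PER_CONTROLLER = 60   # limit on drives each RAID controller can address
WATTS_PER_DRIVE = 10.0       # average power draw per spinning drive
CAPACITY_GROWTH = 0.40       # 40% yearly growth in required capacity

def infrastructure(years_out, start_capacity_tb=500.0):
    """Drives, controllers, and power needed `years_out` years from now."""
    capacity_tb = start_capacity_tb * (1 + CAPACITY_GROWTH) ** years_out
    drives = math.ceil(capacity_tb / DRIVE_CAPACITY_TB)
    controllers = math.ceil(drives / DRIVES_PER_CONTROLLER)
    watts = drives * WATTS_PER_DRIVE
    return drives, controllers, watts

for y in range(5):
    d, c, w = infrastructure(y)
    print(f"year {y}: {d} drives, {c} controllers, {w / 1000:.1f} kW")
```

With drive capacity held flat, a 40% capacity growth rate compounds directly into drive count, and the controller and power lines grow right along with it, which is the point of the list above.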
Each of these factors will affect the cost of storage and the expected cost reductions. Some might say the answer will be Flash, but both Jeff Layton and I have detailed why this will not be the case in his recent article on Flash density and my take on why SSDs won’t replace traditional spinning disk. Like it or not, hard drives are going to make up the bulk of any large data storage requirement. So how does this all impact budgeting for storage costs?
Storage Budgeting Impacts
If you take the technology into account, there are good reasons why costs do not go down in a linear fashion. In addition, anyone doing storage planning needs to consider that disk drives often account for the majority of the cost of the storage infrastructure. Ten years ago, when disk drives cost more than $50 per GB, the cost of power was a small part of the overall cost. Today power can account for as much as one-fifteenth or even one-tenth of the overall cost of storage. Because current perpendicular recording technology is at its density limit, adding disk drives can be very costly, given the cascade of everything needed to support additional drives. I cannot imagine the cost per byte of storage continuing to drop on the same slope it was on back in 2006 or 2008, because it is not on that slope today. Storage density improvements are going to come far more slowly until new technology enters the market in 2013 or 2014, and that will increase costs. Since disk drives are the biggest single cost item for storage, you will need to change your cost models to take into account both the lack of density increases and the additional costs within the infrastructure. The bottom line is that you are going to need to budget more than you likely thought you would.
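To make the cost-model point concrete, here is a sketch that projects yearly spend under two scenarios: the historical cost-per-TB decline continuing, versus cost per TB staying flat while density stalls. All prices and rates are illustrative assumptions chosen for the sketch, not figures from the article.

```python
# Illustrative assumptions, not measured figures:
COST_PER_TB = 100.0        # current cost, $/TB
HISTORICAL_DECLINE = 0.30  # 30%/yr cost-per-TB drop on the old slope
GROWTH = 0.40              # 40%/yr growth in capacity purchased
BASE_BUY_TB = 500.0        # capacity purchased in year 0

def yearly_spend(years_out, decline):
    """Cost of the capacity bought `years_out` years from now."""
    added_tb = BASE_BUY_TB * (1 + GROWTH) ** years_out
    cost_per_tb = COST_PER_TB * (1 - decline) ** years_out
    return added_tb * cost_per_tb

for y in range(4):
    old_slope = yearly_spend(y, HISTORICAL_DECLINE)
    flat = yearly_spend(y, 0.0)
    print(f"year {y}: old slope ${old_slope:,.0f} vs flat density ${flat:,.0f}")
```

On the old slope the cost decline largely offsets capacity growth; with cost per TB flat, the budget grows at the full capacity growth rate, which is the gap a straight-line model hides.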
Henry Newman, CEO and CTO of Instrumental, Inc., and a regular Enterprise Storage Forum contributor, is an industry consultant with 29 years' experience in high-performance computing and storage.
Follow Enterprise Storage Forum on Twitter.