Disk-based systems have grown by leaps and bounds over the past decade, and they are now large enough and cost-effective enough to handle massive quantities of data. There are many options to choose from; here are a few interesting ones.
Dell DX Object Storage Platform
The rise of unstructured data has clearly given organizations a problem when it comes to data retention. In the eyes of some, a change of approach is required.
Derek Gascon, product strategist for object technology at Dell, said that some users have reached or exceeded the limits of traditional file storage and realize they need to do something different.
“Retaining and preserving file-based data on these devices is a major pain point that requires new methods and technologies to address it,” he said. “With petabytes becoming the common order of magnitude and the amount of file-based data elements reaching into the hundreds of millions to billions, IT organizations are actively investigating innovative solutions that help them reduce operational costs and storage complexity.”
Dell’s solution is the DX Object Storage Platform. It was purpose-built for long-term retention and preservation of data via a clustered architecture built on standard Dell x86 servers. DX clusters virtualize the internal disk capacity in each node (server) and across all servers in the cluster, creating a scalable pool of storage. The DX can scale to multiple petabytes and support trillions of objects in a cluster with a 128-bit flat address space. Data is stored in the DX as an object: a combination of metadata and file data stored together. With policy-based management, the DX enables replication and distribution of objects across distributed sites for disaster recovery. It also provides background integrity checking for silent corruption to ensure long-term preservation.
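The object model described above, data and metadata bundled together with a checksum that a background process can later verify, can be sketched as follows. This is an illustrative example only; the class and field names are assumptions, not Dell's API.

```python
import hashlib

class StoredObject:
    """Hypothetical sketch: an object bundles file data with metadata,
    and a stored checksum lets a background scan detect silent corruption."""

    def __init__(self, data: bytes, metadata: dict):
        self.data = data
        self.metadata = metadata
        # Fingerprint the data at write time.
        self.checksum = hashlib.sha256(data).hexdigest()

    def verify(self) -> bool:
        """Background integrity check: recompute and compare the checksum."""
        return hashlib.sha256(self.data).hexdigest() == self.checksum

obj = StoredObject(b"archived report", {"retain_until": "2032-01-01"})
assert obj.verify()       # intact object passes the check
obj.data = b"tampered"    # simulate silent corruption on disk
assert not obj.verify()   # the background check detects it
```

The point of storing the checksum alongside the object is that integrity can be verified years later without any external record of what the data should look like.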
“DX is competitively positioned based on its architecture that is similar to the one proven by major public cloud providers, i.e. it is object-based, clustered on standard servers, and its data protection minimizes the need for backup,” said Gascon. “As customers are having to retain authentic data accessible for years and/or decades, simplifying the move to a new platform with a major data migration effort every three years is a significant value.”
Pricing data was not forthcoming. According to Dell, the cost varies based on capacity needs.
“Archive is based on an information management approach and the technology best suited for it is object storage, which itself is an emerging segment yet gaining traction rapidly because of its use in the cloud,” said Gascon. “Combined with the promise of big data and adding value to business intelligence, we’re going to see significant advancements in object storage and the technologies that interact with archived content over the next several years.”
NetApp SnapLock
NetApp’s solution to the problems posed by archiving is to provide an additional piece of software to work with the Data ONTAP operating system that is used to run NetApp arrays and NAS filers.
NetApp SnapLock encompasses data integrity and retention, enabling electronic records to be unalterable and rapidly accessible. It also integrates with archive and content management applications such as IBM FileNet, EMC Documentum, Oracle ILM, Open Text ECM Suite, Microsoft SharePoint, and Symantec Enterprise Vault. And it supports NFS and CIFS protocols, so no proprietary API is needed.
“SnapLock combines NetApp storage with integrated data protection and storage efficiency technologies to store, manage and protect more data using less disk and less effort,” said Nathan Walker, Systems Engineer, Enterprise Solutions, NetApp.
SnapLock is available in two versions: SnapLock Compliance, which is designed to satisfy records-retention requirements by archiving e-mails, documents, audit information, and other data in an unalterable state for years; and SnapLock Enterprise, which allows more administrative flexibility. Both are licensed per storage controller, regardless of the amount of data secured with SnapLock.
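Because SnapLock works over standard NFS and CIFS rather than a proprietary API, committing a file to WORM is done with ordinary file operations. The commonly documented pattern, sketched below with hypothetical paths, is to encode the retention date in the file's last-accessed timestamp and then remove write permissions to trigger the commit.

```python
import os
import stat
import time

def commit_to_worm(path: str, retain_until_epoch: float) -> None:
    """Sketch of the standard-protocol WORM commit pattern (assumption:
    the file lives on a SnapLock volume; on an ordinary filesystem this
    just sets the atime and makes the file read-only)."""
    # 1. Encode the retention date in the file's last-accessed time.
    os.utime(path, (retain_until_epoch, os.path.getmtime(path)))
    # 2. Transition the file to read-only; on a SnapLock volume this
    #    commits it to WORM until the retention date passes.
    os.chmod(path, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)

# Example: retain a record for seven years from now (path is hypothetical).
seven_years = time.time() + 7 * 365 * 24 * 3600
# commit_to_worm("/mnt/snaplock_vol/records/invoice.pdf", seven_years)
```

The appeal of this design is that any archive application that can write files over NFS or CIFS can create compliant records without linking against a vendor SDK.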
“NetApp’s Agile Data Infrastructure enabled customers to have the performance and availability levels equivalent to and, in some cases, greater than SAS disk but with the price and capacity attributes of SATA disk,” said Walker. “This allows SnapLock to be affordable and performance-oriented. SnapLock permanence works with NetApp data deduplication, even in compliance mode.”
HP StoreAll X9730
The HP StoreAll 9730 scales to 1.68PB in one unit and can be clustered with other StoreAll systems to hold up to 16PB of unstructured data. HP said it can be installed and running in 10 minutes. The company added that it competes mainly with EMC Isilon and NetApp StorageGRID.
The platform includes HP StoreAll Express Query, an embedded metadata database technology that allows organizations to locate files and perform rapid file system analytics.
Patrick Osborne, Director of Product Management, HP Storage, said that some customers are asking for stratification of data within the application workflow. They need to archive data from business systems in order to optimize Tier 1 infrastructure or to meet e-discovery and compliance requirements. Some, for example, are growing their footprint of unstructured data and rich media content and need to archive both.
HP StoreAll Storage will be available on Dec. 20 for a starting price of $0.91 per GB. According to Osborne, that includes tiering, snapshots, remote replication, Express Query, WORM, data retention and validation.
“With the rise of big data fueled by digital content, mobile growth and compliance requirements, organizations are experiencing issues with traditional network attached storage (NAS) systems,” said Osborne. “Simple tasks such as storage resource management, placement of data on the appropriate tier of media and long term retention of data are difficult to manage at scale. This trend is forcing users to seek new solutions around the efficient storage of unstructured data.”