Despite the hype surrounding it, the vision of Information Lifecycle Management (ILM) resonates with the bulk of storage end users. Why? Because ILM promises to solve the headaches associated with storage management and provisioning. Bridging the chasm between vision and reality, however, requires standardization of data and data types.
“Some people incorrectly think that an ILM framework ends with the application,” says Galen Schreck, an analyst at Forrester Research and the firm’s Storage Networking Industry Association (SNIA) liaison. “The big stumbling block, though, is that we need a classification system for data rather than plain old ‘storage’ in order to be able to map goals to business processes and storage resources.”
Accordingly, the SNIA Data Management Forum (DMF) is tackling the thorny area of metadata standardization. The DMF sponsors two main initiatives: the Data Protection Initiative and the Information Lifecycle Management Initiative. As part of the ILM initiative, SNIA has worked out a complete ILM framework, one it believes offers a roadmap to ILM success.
“Organizations around the world are seeking standards-based solutions for the effective management of digital information,” said Wayne M. Adams, chair of the SNIA board of directors. “The SNIA Data Management Forum is the industry focal point to lead the efforts to define solutions and best practices for addressing information lifecycle management, complying with industry regulations, securing and protecting data, and providing continuous access to data.”
The focal point of the DMF’s current ILM efforts is metadata – the parts of a file that describe the file as a whole: its contents, structure, attributes and so on. Put another way, metadata is information that describes information, such as labels, catalogs and descriptions.
Metadata comes in two basic flavors: file (or object) level and application level. But within these categories there is broad divergence. In a typical IT environment, for example, the various storage elements – nearline storage and tape among them – each keep different types of metadata.
Similarly, when you move up to the application layer, you find email archivers, file archivers, database archivers, NAS filers and content management programs storing metadata in a non-uniform and sometimes proprietary manner. This makes it impossible to create centralized policies and rules to manage the infrastructure.
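The file-level versus application-level split can be illustrated with a short Python sketch. File-level metadata is whatever the filesystem itself tracks; application-level metadata is whatever schema each application invents on top (the field names in the dictionaries below are invented for illustration, not taken from any real product):

```python
import os
import tempfile

# File-level (object-level) metadata: attributes the filesystem
# maintains for every file, regardless of what the file contains.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"quarterly report")
    path = f.name

info = os.stat(path)
print("size:", info.st_size)    # size in bytes
print("mtime:", info.st_mtime)  # last-modified timestamp

# Application-level metadata: each application layers its own,
# often incompatible, schema on the same underlying content.
# (Field names below are hypothetical.)
email_archiver_meta = {"From": "a@example.com", "Retention": "7y"}
content_mgmt_meta = {"author": "a@example.com", "keep_until": "2012-06-01"}

os.unlink(path)
```

Two tools describing the same object can thus disagree even on basics such as who owns it and how long it must be kept, which is exactly the divergence the DMF is trying to tame.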
In a While, Data File
The DMF sees the way forward as the creation of a common information and data services layer that all storage services and applications access. This shared metadata layer, however, would require immense cooperation from the entire storage and hardware industry, and that cooperation will prove very difficult to achieve.
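What such a shared layer might look like can be sketched as a normalized record that per-application adapters map into. This is purely illustrative – the record fields and adapter below are hypothetical, not part of any SNIA specification:

```python
from dataclasses import dataclass


# Hypothetical common metadata record that every storage service and
# application would read and write, instead of its own private format.
@dataclass
class CommonRecord:
    object_id: str
    owner: str
    retention_days: int


def from_email_archiver(raw: dict) -> CommonRecord:
    # Adapter: translate one application's proprietary fields
    # (invented here for illustration) into the shared schema,
    # so that policy engines only ever see CommonRecord.
    return CommonRecord(
        object_id=raw["MessageID"],
        owner=raw["From"],
        retention_days=int(raw["RetainYears"]) * 365,
    )


rec = from_email_archiver(
    {"MessageID": "msg-001", "From": "a@example.com", "RetainYears": "7"}
)
print(rec.retention_days)  # → 2555
```

With one adapter per application, centralized retention and placement policies could be written once against the common schema – which is precisely the kind of cross-vendor agreement the article notes will be hard to win.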
“Metadata is a form of currency so it won’t be standardized any time soon,” says Schreck. “But if left alone, metadata will continue to evolve separately and that is creating a real mess.”
The challenge in confronting this chaos revolves around the issue of ownership – whoever controls the metadata can make their own platforms stickier, slow down the commoditization of their gear and exert considerable market influence. Rather than simply attempting to introduce a quick standard, therefore, the DMF is taking a different tack. Its priority is to fully understand the scope and implications of the metadata issue, and define the basic vendor-independent constructs that may one day feed into standards.
The bottom line: SNIA sees a long and hard battle ahead on the road to metadata uniformity. It has realized that promulgating a standard in the current climate would be foolhardy. Instead, it is making very sure it has a complete picture of the data landscape before it moves forward.
Surprisingly, another stumbling block may well turn out to be old data classification systems. These are often proprietary or based on other industry/government standards. And they may well prove inadequate when it comes to true ILM functionality.
“Many of the old data systems that evolved prior to ILM may well have to be revised,” says Schreck.
Until those systems are revised and metadata uniformity is achieved, the outlook for ILM looks bleak. This is a primary reason why early tools bearing the ILM moniker require a prohibitive amount of customization and only go a small way towards the total ILM vision. Further, the lack of standards is likely to cause companies to continue to buy point products that can’t talk to each other. This is the disease that results in console hopping – having to move from management console to management console in order to navigate the IT infrastructure.
“I’ve seen clients with 114 management tools, so there is no doubt we have to thin them out,” says Schreck.
The DMF, therefore, is pursuing a twofold strategy: working from the top down to design an ILM framework and metadata standard that might one day succeed, while at the same time staying closely in tune with early ILM products to see how the market is evolving.
Article originally appeared on EnterpriseITPlanet.com.