Three Macro Trends in Enterprise Data Storage


The enterprise storage market is changing rapidly in response to extreme data growth. Storage media capacity and performance, the ubiquitous cloud, compliance in a complex world, and the ability to analyze data for business value are all aspects of industry change.

Among all this development, three defining trends stand out: software-defined storage, hyperconvergence, and artificial intelligence. These three areas have a deep impact on the entire data center and drive significant storage innovations.

1. Software-Defined Storage (SDS)

Software-defined storage is showing up more frequently in storage rollouts. Purchases will increase throughout 2018 as admins tackle storage technology refreshes — and get sticker shock from the price of intelligent storage arrays.

SDS decouples storage intelligence from the underlying storage devices. IT can buy commodity hardware while the software-defined storage layer provides intelligence. Some SDS products also integrate legacy storage into the centralized management interface. The software layer directs policy-driven workload processing, intelligent data movement, load balancing, dedupe, replication, snapshots, and backup.
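
To make the idea concrete, here is a minimal Python sketch of policy-driven placement: a policy names a tier and a replica count, and the SDS layer maps an object onto commodity devices accordingly. The device names, policy fields, and placement rule are hypothetical, for illustration only, not any particular product's API.

    # A minimal sketch of policy-driven data placement in an SDS layer.
    # Device names and Policy fields are hypothetical.

    import zlib
    from dataclasses import dataclass

    @dataclass
    class Policy:
        name: str
        replicas: int   # how many copies the SDS layer maintains
        tier: str       # "ssd" for hot data, "hdd" for cold data

    # Commodity devices registered with the SDS control plane, grouped by tier.
    DEVICE_POOLS = {
        "ssd": ["node1:nvme0", "node2:nvme0"],
        "hdd": ["node1:sda", "node2:sda", "node3:sda"],
    }

    def place(object_id, policy):
        """Pick target devices for an object according to its policy."""
        pool = DEVICE_POOLS[policy.tier]
        if policy.replicas > len(pool):
            raise ValueError("not enough devices to satisfy the replica count")
        # Hash the object ID so replicas spread deterministically across the pool.
        start = zlib.crc32(object_id.encode()) % len(pool)
        return [pool[(start + i) % len(pool)] for i in range(policy.replicas)]

    hot = Policy(name="oltp-hot", replicas=2, tier="ssd")
    print(place("customer-db/block-42", hot))   # two SSD-backed replicas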

It’s not about cheap storage: admins still need to spend money on performance and capacity in commodity hardware. But it is cost-effective, enables centralized storage management, and frees IT from storage vendor lock-in. Typical use cases include:

·  Legacy application environments. The business does not intend to retire the application but does not want to use expensive storage for it either. Instead of choosing between high-priced intelligent arrays and low-cost basic infrastructure, they run the application on commodity hardware with intelligent data services.

·  Data lakes. Big data environments house large volumes of data from sources like business applications, machine sensors, and Internet of Things (IoT). These environments can be very expensive to store on high-priced intelligent arrays. SDS provides scale-out storage pools for data lakes with centralized management. Look for SDS offerings that natively support big data frameworks like Hadoop and NoSQL. 

·  Multi-vendor storage environments. Early SDS products lacked centralized management for the multiple vendors' storage devices under their control. Now most SDS products can integrate different devices under a single dashboard, which saves significant time and resources for IT. Centralized management enables IT to apply security settings, policies, and provisioning pools across a single logical infrastructure instead of working through different interfaces at the device level. And admins can add components without manual data migration and load balancing (a sketch of this idea follows the list).


·  Avoid vendor lock-in and future-proof investments. SDS enables companies to add storage devices to the SDS pool and management layer. This lengthens the lifecycle of legacy storage devices and avoids being locked into a single vendor's storage systems.
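
As a rough illustration of the single-dashboard idea above, the sketch below wraps two hypothetical vendor backends, each with a different native API, behind one pooled interface that balances writes by free space. The class and method names are invented for the example, not real vendor APIs.

    # A minimal sketch of how an SDS management layer can present
    # multi-vendor devices as one logical pool. All names are hypothetical.

    class VendorABackend:
        def free_gb(self): return 800
        def write(self, key, data): print(f"vendor A stores {key}")

    class VendorBBackend:
        def capacity_remaining(self): return 1200 * 1024  # reports in MB
        def put_object(self, key, data): print(f"vendor B stores {key}")

    class UnifiedPool:
        """Normalizes vendor-specific APIs behind one interface."""
        def __init__(self):
            self.backends = []
        def add(self, backend, free_fn, write_fn):
            self.backends.append((backend, free_fn, write_fn))
        def write(self, key, data):
            # Route to the backend with the most free space (simple balancing).
            backend, _, write_fn = max(self.backends,
                                       key=lambda b: b[1](b[0]))
            write_fn(backend, key, data)

    pool = UnifiedPool()
    pool.add(VendorABackend(), lambda b: b.free_gb(),
             lambda b, k, d: b.write(k, d))
    pool.add(VendorBBackend(), lambda b: b.capacity_remaining() / 1024,
             lambda b, k, d: b.put_object(k, d))
    pool.write("report.csv", b"...")   # lands on vendor B (more free space)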

Key takeaway: SDS is not an automatic solution for all data storage. If your existing arrays are working well, there is no need to retire them early. Enterprise arrays serving high-transaction primary storage will always have a place, and specialized arrays for specific verticals and use cases remain popular. But SDS will serve you well when you need to simplify complex multi-vendor storage, extend the life of legacy components, and carry out a cost-effective technology refresh. We also expect SDS usage to grow because of its ability to rapidly expand capacity for fast-growing data.

2. Hyperconverged Infrastructure (HCI)

Converged and hyperconverged systems replace computing silos with combined infrastructure. The architecture depends heavily on virtualization.

The earliest converged infrastructure (CI) systems integrated servers, networking, storage, and a hypervisor into a single chassis. The converged system maker and the component manufacturers certify that their hardware and software are compatible with each other. CI still works well in scale-up virtual environments carrying unique workloads.

Hyperconverged infrastructure (HCI) evolved from CI. Hyperconverged platforms also integrate components and hypervisors, but into highly scalable systems that scale out across virtualization clusters. System management dynamically expands to added clusters and handles network optimization, deduplication, and cluster tuning for a variety of workloads.

The next evolution of HCI added secondary data storage to hyperconverged platforms. Primary clusters back up to hyperconverged storage, and admins can manage both environments through the central dashboard. Virtualization enables dynamic provisioning, and the secondary storage environment lets admins apply advanced data services such as indexing and analytics for compliance and security.
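
The value of those data services is easiest to see with a toy example. The sketch below indexes backup snapshots as they land on secondary storage, so a compliance search can identify which snapshots contain a term without restoring anything. The snapshot IDs and documents are hypothetical.

    # A minimal sketch of indexing as a data service on secondary storage.
    # Real products index metadata and content far more richly.

    from collections import defaultdict

    index = defaultdict(set)   # term -> set of backup snapshot IDs

    def ingest_backup(snapshot_id, documents):
        """Index each document's terms as the snapshot is written."""
        for doc in documents:
            for term in doc.lower().split():
                index[term].add(snapshot_id)

    def search(term):
        """Find which snapshots contain a term, e.g. for e-discovery."""
        return index[term.lower()]

    ingest_backup("snap-2018-01-01",
                  ["quarterly revenue report", "vendor contract"])
    ingest_backup("snap-2018-02-01", ["updated vendor contract terms"])
    print(search("contract"))   # both snapshots match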

HCI’s primary benefits are elasticity, performance, mixed workloads, and simplifying complex computing environments.

·  Elastic workloads. HCI dynamically handles spikes in demand from its converged servers and storage. The system automatically assigns compute cycles or storage processing to workloads when they need it and reassigns resources upon release (see the sketch after this list).

·  Faster storage. Hyperconverged storage places servers and storage closer together, which lowers latency and reduces pathing problems. Add the ability to dynamically assign storage resources, and it eliminates many data center storage headaches.

·  Mixed workloads. HCI platforms originally served as specialized environments for Tier 1 applications and VDI. They are still popular for these environments, and IT is increasingly adopting them to consolidate and easily scale a variety of workloads.

·  Simplified management. SANs can be very complex to set up. HCI’s virtualized environment uses software-defined storage (SDS) to pool storage and automatically provision it to workloads.
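
The sketch below illustrates the elastic behavior described in the first item in miniature, assuming a single shared pool of abstract resource units: workloads draw units on demand and the pool reclaims them on release. Real HCI schedulers are far more sophisticated; this only shows the assign-and-reclaim cycle, with invented workload names and numbers.

    # A minimal sketch of elastic resource assignment in an HCI pool.

    class ResourcePool:
        def __init__(self, total_units):
            self.free = total_units
            self.assigned = {}   # workload -> units

        def request(self, workload, units):
            """Grant units if available; caller retries or queues otherwise."""
            if units > self.free:
                return False
            self.free -= units
            self.assigned[workload] = self.assigned.get(workload, 0) + units
            return True

        def release(self, workload):
            """Return a workload's units to the shared pool."""
            self.free += self.assigned.pop(workload, 0)

    pool = ResourcePool(total_units=100)
    pool.request("vdi-burst", 40)   # morning login storm
    pool.request("analytics", 50)
    pool.release("vdi-burst")       # storm over; capacity returns to the pool
    print(pool.free)                # 50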

Key takeaway: HCI simplifies data center infrastructure by centralizing compute resources under a single management interface. Unifying resources makes purchasing and management easier, and the scale-out architecture serves both computing and storage requirements. HCI also provides high-performance, highly scalable environments for Tier 1 applications, VDI, and mixed workloads.

3. AI/Machine Learning

At first glance, AI may seem like an odd choice for a defining data storage trend. AI has broad uses outside of the data center, but also has a deep impact on internal big data analysis, data movement automation, and behavior prediction based on digital communication patterns.

·  AI-driven applications that depend on real-time data. AI has a big impact on transportation and logistical supply chains. For example, a transportation company’s data center runs AI software that tracks truck movements throughout a region. The data center ingests large volumes of real-time weather and traffic data and feeds it to the AI software, which directs drivers to adjust their routes for the fastest and safest trips. Other AI software automatically detects unusual energy patterns to predict events like power surges or power outages, allowing data center admins to mitigate problems before an event causes significant damage.

·  Machine learning. Machine learning is a subset of AI. The key to machine learning is that once it learns from a seed knowledge set, it can expand its own learning as it operates in its environment. For instance, machine learning enables software toolsets to dynamically balance and optimize data on SSDs to compensate for flash wear-out. Over time, machine learning improves dynamic data placement without manual intervention. This is a big deal in a data center with high rates of data growth and flash investment.

·  Builds analytics capabilities. Machine learning also builds analytical models. Programmers create the initial analytical toolset, which independently adapts and grows as it encounters new data. The result is that the toolset issues increasingly accurate conclusions and results to business users.

·  Pattern recognition. AI recognizes patterns and, by extension, recognizes when those patterns are broken. Pattern recognition works across a broad range of business use cases, including social semantics and sentiment analysis for social media marketing, and governance and fraud investigations into digital communications.
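
Here is a toy version of that pattern-break detection: flag any value that deviates sharply from the trailing baseline. The message-volume numbers and threshold below are invented; real systems learn far richer baselines, but the core test is the same.

    # A minimal sketch of detecting a broken pattern in a data stream.

    from statistics import mean, stdev

    def find_anomalies(values, window=5, threshold=3.0):
        """Flag points more than `threshold` standard deviations
        from the trailing window's mean."""
        anomalies = []
        for i in range(window, len(values)):
            baseline = values[i - window:i]
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
                anomalies.append(i)
        return anomalies

    # Daily message counts between two accounts; day 8 breaks the pattern.
    volumes = [21, 19, 22, 20, 21, 20, 22, 21, 95, 23]
    print(find_anomalies(volumes))   # [8]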

Key takeaway: Your existing data management tools may already be using AI and machine learning. If you decide to make a separate investment in AI-enabled software, understand your business case and how you will achieve ROI on a significant financial investment. For example, sentiment analysis software can raise profits by improving social media marketing. The same technology can also lower risk by analyzing communication trends among radical groups.

Christine Taylor
Christine Taylor is a writer and content strategist. She brings technology concepts to vivid life in white papers, ebooks, case studies, blogs, and articles, and is particularly passionate about the explosive potential of B2B storytelling. She also consults with small marketing teams on how to do excellent content strategy and creation with limited resources.
