Open SAN Architecture Management, Page 3



Consolidation through a SAN involves creating a pool of storage resources (switches, arrays, tape libraries, and other storage assets) and sharing it among heterogeneous hosts. This model improves scalability, availability, and data accessibility while reducing management complexity. By cutting costs and waste at the same time, consolidation solutions help businesses manage and protect data more efficiently.

By combining data from many servers onto highly available, scalable, centralized storage systems, consolidation helps companies reduce the complexity of their storage environments while improving the performance and data accessibility that end users experience. At the same time, environment costs and floor space are reduced, and service levels can be enhanced.


Data Continuance

Proper planning is not extremely expensive; even brief business interruptions can be. A data continuance environment provides a means of achieving uninterrupted operations, for both business continuity and disaster recovery, by delivering continuous application availability and data accessibility. Data continuance solutions enable administrators to start with a thorough classification of applications and data based on business value.

The software can assist in identifying critical data assets (databases, applications, and associated file systems) and in reporting on their relationships to the storage resources that support them. These solutions thus provide systems and storage managers with the capacity, database, and file-level information needed to meet uptime requirements and application service levels.
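As a simple illustration, the classification-by-business-value step described above can be sketched in a few lines of Python. The asset records, value scores, and tier thresholds here are hypothetical, not taken from any particular data continuance product.

```python
# Hypothetical sketch: classify data assets into protection tiers
# based on a business-value score, as a data continuance tool might.

# Each asset maps an application to the storage resources supporting it.
assets = [
    {"app": "orders-db",   "storage": ["array-01", "tape-lib-02"], "value": 95},
    {"app": "hr-files",    "storage": ["array-03"],                "value": 60},
    {"app": "dev-scratch", "storage": ["array-04"],                "value": 20},
]

def classify(value):
    """Assign a protection tier from a business-value score (0-100)."""
    if value >= 90:
        return "mission-critical"    # continuous availability required
    if value >= 50:
        return "business-important"  # daily backup, fast recovery
    return "best-effort"             # standard backup window

for asset in assets:
    tier = classify(asset["value"])
    print(f'{asset["app"]}: {tier} (on {", ".join(asset["storage"])})')
```

A real tool would discover these application-to-storage relationships automatically rather than hard-code them, but the principle is the same: rank assets by business value, then drive uptime and protection policies from the resulting tiers.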


High Performance Computing

High Performance Computing (HPC) is one solution segment at the forefront of driving rapid SAN adoption. HPC requirements such as highly available data movement and management are now just as important as performance and scalability.

Resource-intensive HPC requires storage that delivers extremely fast, high-volume data movement under heavy workloads. It also demands the ability to share data among multiple computers at high speed, with very low latency and minimal disruption of service. To support data-intensive relational databases, data mining, and complex scientific applications, these requirements point to the need for a common, easy-to-use storage platform that delivers extreme levels of performance and scalability.


Mission Critical Computing

Finally, because of its inherent resiliency, reliability, and performance, SAN infrastructure is increasingly being used to meet the extreme levels of availability, performance, and connectivity required by high-end data centers where mission-critical applications and data are deployed and managed.
