Public cloud storage is an effective way for businesses to access massive amounts of data storage capacity and advanced storage management capabilities with pay-as-you-go pricing and without massive up-front investments. Today, it’s not uncommon for organizations to use a public cloud provider as a repository for backup and archival data, files used in productivity and collaboration software suites, and many other use cases.
But any discussion about entrusting your organization’s data to a public cloud or online storage provider requires an understanding of how the public cloud works compared to private clouds and on-premises data storage systems.
Cloud-based storage is available from several providers, including Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform, IBM Cloud – the list goes on. Typically, key characteristics include a multi-tenant architecture and the underlying use of object or block storage and APIs (application programming interfaces) that allow applications to access data over the Internet.
Thanks to the competitive nature of the cloud-computing market, costs can be much lower than buying and maintaining storage arrays and networks. As of this writing, an Amazon S3 (Simple Storage Service) standard plan costs 2.3 cents per gigabyte (GB) per month for the first 50 terabytes (TB) stored in certain AWS regions, with the per-gigabyte price dropping as the amount of data stored in the service climbs. Glacier Storage, a cold storage solution for rarely accessed data, costs 0.4 cents per gigabyte.
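To see how tiered, pay-as-you-go pricing plays out, here is a minimal sketch of a monthly cost estimate using the 2.3-cents-per-GB figure quoted above. The rates for the higher tiers are assumptions for illustration; actual S3 pricing varies by region and changes over time, so check current AWS pricing before relying on any of these numbers.

```python
def s3_standard_monthly_cost(gb_stored):
    """Estimate the monthly S3 Standard storage bill (USD) for a given
    amount of stored data, using tiered per-GB rates. Only the first-tier
    rate comes from the text above; the others are illustrative."""
    tiers = [
        (50 * 1024, 0.023),     # first 50 TB at 2.3 cents/GB (from the text)
        (450 * 1024, 0.022),    # next 450 TB (assumed rate)
        (float("inf"), 0.021),  # everything over 500 TB (assumed rate)
    ]
    cost, remaining = 0.0, gb_stored
    for tier_size, price_per_gb in tiers:
        billed = min(remaining, tier_size)
        cost += billed * price_per_gb
        remaining -= billed
        if remaining <= 0:
            break
    return round(cost, 2)

# Storing 10 TB sits entirely within the first tier: 10,240 GB * $0.023.
print(s3_standard_monthly_cost(10 * 1024))  # 235.52
```

Note that this covers storage capacity only; as discussed below, request and data transfer fees add to the real bill.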
Organizations may want to be mindful of price differences between different cloud data center regions and the transfer fees that may be incurred to get a fuller picture of the true cost of placing their data in the hands of a public cloud provider.
Private clouds, as the term suggests, may borrow some characteristics of the public cloud, such as object storage and an “as-a-service” approach to delivering IT resources, but they are fundamentally different.
A business rolling out private cloud services – not to be confused with the virtual private cloud offerings available on AWS, Microsoft Azure and other providers – will still need to furnish and maintain its own storage systems, much as it would with traditional architectures. One major draw to this approach is that a business can tailor its storage environment to its needs and strictly control its various aspects.
Now, let’s explore some of the benefits and downsides of using the public cloud for your enterprise storage needs.
- Scalability: Public cloud storage services offer the ultimate in scalability, allowing users to increase or decrease the amount of storage capacity they need practically at a moment’s notice.
- Accessibility: Another big reason public cloud storage is taking off is its ability to provide global access to data. This is a boon for companies with distributed workforces and customers located across the globe. As long as they have internet access, far-flung developer teams and end-users can securely access the data they need to get work done.
- Resource pooling: Public cloud providers can achieve massive economies of scale and pass the savings onto customers, courtesy of the resource-pooling approach they take to deliver IT services; a new server or storage system isn’t spun up each time a new customer signs up or requests more capacity. Cloud environments pool their compute and storage resources, isolating them and allowing for multitenant operations.
- Service providers are in control of data: It may seem like a deal-breaker, but businesses that place their data on the cloud relinquish some measure of control over their data to a service provider. Storage administrators accustomed to fine-tuning their environments to fit the needs of their organization, its applications and users may bristle at the notion of letting a third party dictate the performance characteristics of their storage services.
- Performance and reliability: Although cloud providers are experts at running large-scale data center operations, they can still suffer equipment failures and other outages. Access to cloud storage can also be affected by bouts of unpredictable data throughput over the internet, which can be mitigated in some cases with dedicated connections like Azure ExpressRoute. That said, most major cloud providers offer enviable uptimes and operate world-class facilities.
- Is the public cloud suitable for your business environment?: Cloud storage is ideal for cloud-native businesses and startups since investments in on-premises storage for their cloud-delivered applications would make little sense. Organizations with rigorous data security, privacy and performance requirements for their applications and in-network storage services will generally keep their data in-house.
Cost is a major factor in using cloud storage. As mentioned earlier, customers pay cloud providers for the resources they use rather than pay to acquire storage systems and media and then fill them up over time until the time inevitably comes for a storage upgrade. On the cloud, customers needn’t worry about a provider’s capacity upgrades or managing aging storage arrays; it’s all baked into the price.
Here’s a look at some major cloud storage providers and some of the services they offer.
- Amazon Web Services: Amazon S3 (Simple Storage Service) is the company’s go-to object storage product for countless companies. Amazon EBS (Elastic Block Store) provides low-latency block storage for databases, applications, analytics engines and other demanding workloads. Amazon Glacier is the company’s low-cost storage for long-term backups, archives and other data that is rarely accessed.
- Microsoft Azure: Microsoft’s cloud-based object storage is called Blob, and the company offers a file storage service, simply called Files, which supports the Server Message Block (SMB) protocol. Also on tap is a premium managed Disk Storage offering for I/O-intensive workloads that offers SSD- or HDD-based options. Azure Archive storage can be used to park backups and long-term archives.
- Google Cloud: Sensing a pattern yet? Google offers similar options across four storage classes based on an organization’s workload requirements.
For workloads characterized by high-frequency data access, the company offers Multi-regional and Regional cloud storage. For backup and archival needs, Nearline storage is meant for data accessed less than once per month, while Coldline storage stretches that to less than once a year.
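The storage-class thresholds above can be captured in a simple selection helper. This is a simplified sketch based only on access frequency; real decisions should also weigh retrieval fees and each class’s minimum storage duration, which this function ignores.

```python
def suggest_gcs_storage_class(accesses_per_year):
    """Map an expected access frequency to a Google Cloud storage class,
    following the rough thresholds described above: frequent access gets
    Regional/Multi-regional, less than monthly gets Nearline, and less
    than yearly gets Coldline. Illustrative only."""
    if accesses_per_year >= 12:   # roughly monthly or more often
        return "Regional/Multi-regional"
    elif accesses_per_year >= 1:  # less than once a month
        return "Nearline"
    else:                         # less than once a year
        return "Coldline"

print(suggest_gcs_storage_class(52))  # Regional/Multi-regional
print(suggest_gcs_storage_class(4))   # Nearline
print(suggest_gcs_storage_class(0))   # Coldline
```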
- IBM Cloud: IBM provides both object and block storage solutions, the latter of which is backed by SSD storage that delivers SAN-like durability, claims Big Blue. On the file storage front, the company’s Network File System (NFS) offering is also backed by flash storage capacity. Customers can also select a content delivery network (CDN) storage option that keeps media files and other data closer to their users.
- Possible vendor lock-in?: Vendor lock-in is a real concern. Although cloud vendors won’t technically hold your data hostage, users may incur sky-high data egress and migration fees if they decide to switch to another provider, dissuading them from making a move.
And as you may have guessed, those migration fees climb quickly as more data accumulates in a provider’s cloud over the years. While evaluating an enterprise cloud storage provider, determine the costs of both uploading your data and retrieving it in bulk if the time comes to change providers.
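A back-of-the-envelope estimate makes the lock-in math concrete. The sketch below assumes a hypothetical flat $0.09-per-GB egress rate purely for illustration; real transfer-out pricing is tiered and varies by provider, region and destination, so substitute your provider’s published rates.

```python
def migration_egress_cost(tb_stored, egress_per_gb=0.09):
    """Rough one-time cost (USD) of transferring a dataset out of a cloud
    provider during a migration. The $0.09/GB default is a hypothetical
    flat rate for illustration, not any provider's actual pricing."""
    return round(tb_stored * 1024 * egress_per_gb, 2)

# Moving 100 TB out at the assumed flat rate:
print(migration_egress_cost(100))  # 9216.0
```

Even at modest per-gigabyte rates, a multi-petabyte migration can run into six figures, which is exactly why egress costs belong in any provider evaluation.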
Perform security audits and verify that a cloud provider’s own storage security controls align with your organization’s own security and compliance requirements. Also, make certain their encryption and authentication options are up to snuff.
Fortunately, major cloud providers are fairly forthcoming with their security and compliance disclosures and they generally offer the tools to help customers audit their own environments and ensure that their cloud data is safe.