Developing an Edge Computing Storage Strategy


More and more firms are recognizing the value of the IoT, whether in providing manufacturing insights or simply a better customer experience. Unfortunately, for many firms the IoT also presents major challenges. 

At the moment, the clearest of these challenges relates to scalable data processing. Large IoT networks produce huge amounts of data that must be collected, processed, and stored, all at low latency. Because of this, it is increasingly apparent that the “traditional” model of storing all data on a central cloud server is quickly becoming obsolete.

The replacement? Edge computing.

Edge computing is one of the most exciting developments in cloud storage for IoT data. It describes a model of data storage and computing in which data is both stored and processed closer to where it is generated and used. Rather than being sent to a distant server, data is held in interstitial storage media and, as far as possible, processed there as well.

If implemented correctly, this model can greatly reduce network latency and increase efficiency. To get there, however, organizations will need to think carefully about their edge computing strategy.

How Many Edges?

Let’s get one common misconception out of the way first: when we talk about “the edge,” we really mean a number of edges. Your computer systems likely interact with the “real world” in many different ways, not just through IoT networks (of which you may have several), but also through interactions with customers and staff.

It’s likely that, eventually, you’ll want to store all of this data in the cloud. However, it might also be too costly, in terms of both time and storage space, to send it directly to a central cloud server. So instead, you can set up a smaller, intermediate server that will perform some processing, send the results back to your IoT devices, and then send a copy of all this to your central cloud servers.

These intermediate servers can be implemented in a variety of ways. In industrial applications, where unexpected latency can have costly effects on manufacturing processes, engineers will often set up a dedicated physical server to handle the data from a particular part of the IoT network. In less performance-sensitive environments, you can make use of server virtualization, in which a single physical server is treated as though it were composed of multiple sub-servers.
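As a rough illustration of the intermediate-server workflow described above, the sketch below shows a minimal edge service that processes each incoming reading locally, replies to the device immediately, and queues a copy of the raw reading for later upload to central cloud storage. It uses only the Python standard library; the endpoint, the threshold check, and the upload step are hypothetical placeholders, not a reference implementation.

    # Minimal sketch of an intermediate edge server (hypothetical names and
    # endpoints), using only the Python standard library. It processes each
    # reading locally, replies to the device at once, and queues a copy of
    # the raw reading for later upload to central cloud storage.
    import json
    import queue
    import threading
    from http.server import BaseHTTPRequestHandler, HTTPServer

    cloud_upload_queue = queue.Queue()  # raw readings awaiting forwarding

    def process_reading(reading: dict) -> dict:
        # The latency-sensitive work happens here, close to the device.
        # A simple threshold check stands in for real processing.
        return {
            "device_id": reading.get("device_id"),
            "alert": reading.get("temperature", 0.0) > 80.0,
        }

    def forward_to_cloud() -> None:
        # Background worker: drain the queue and ship readings to the cloud.
        # The actual upload call is provider-specific and omitted here.
        while True:
            reading = cloud_upload_queue.get()
            # upload_to_cloud(reading)  # hypothetical, e.g. an object storage PUT
            cloud_upload_queue.task_done()

    class EdgeHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            reading = json.loads(self.rfile.read(length))
            result = process_reading(reading)   # 1. process at the edge
            cloud_upload_queue.put(reading)     # 2. queue a copy for the cloud
            body = json.dumps(result).encode()  # 3. answer the device quickly
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        threading.Thread(target=forward_to_cloud, daemon=True).start()
        HTTPServer(("0.0.0.0", 8080), EdgeHandler).serve_forever()

In this arrangement, a device posts its readings to the edge service and gets the processed result back in milliseconds, while the raw copy travels to the central cloud on its own schedule.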

The point here is that, in order to get the most out of the edge computing model, you will need to identify where performance gains are possible and where they would be most useful. For most companies running IoT networks, that will mean multiple edge servers managing multiple IoT sub-networks.


Moving Beyond the IoT

You’ll notice, at this point, that edge computing doesn’t need to be limited to IoT applications. Though the concept of “the edge” was developed in this context, in many respects it is a fresh take on an old technique: setting up interstitial servers to hold data temporarily during times of high network load.

This was common practice in the days when internet connections were erratic and difficult to predict. If staff members were working off-site, their data would be stored temporarily on a local server and only uploaded to the central data storage cloud once it was certain this could be done safely.

The storage infrastructure behind edge computing works in much the same way, but for IoT networks, or, potentially, for any system that needs to interact directly with the real world but doesn’t need to store all of its data immediately. In use cases outside of the IoT, however, these systems are generally referred to as cloud storage gateways.

Some even approach a data fabric in their level of sophistication. It is fairly common practice for storage gateways to compress information while uploading data to the cloud, to keep bandwidth use to a minimum. More advanced solutions can even perform heavy processing locally and then send only the processed data to the cloud for storage.
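As a rough sketch of that compression step, the snippet below batches readings, serializes them, and gzips the payload before it would be handed off to an uploader. The sample readings are made up for illustration, and the upload itself is omitted because it depends on your provider’s API.

    # Sketch of the bandwidth-saving compression step a storage gateway might
    # perform before upload. The readings are made-up sample data, and the
    # upload itself is omitted because it depends on the provider's API.
    import gzip
    import json

    def compress_batch(readings):
        # Serialize a batch of readings and gzip it to reduce upload size.
        payload = json.dumps(readings).encode("utf-8")
        return gzip.compress(payload, compresslevel=6)

    readings = [{"device_id": i, "temperature": 20.0 + (i % 10)} for i in range(1000)]
    compressed = compress_batch(readings)
    raw_size = len(json.dumps(readings).encode("utf-8"))
    print(f"raw: {raw_size} bytes, compressed: {len(compressed)} bytes")
    # The compressed payload, not the raw readings, is what travels on
    # to central cloud object storage.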

Scaling Your Architecture

At its core, a move to an edge model is about scaling your IoT network, and potentially your other systems. However, it’s important to realize that edge systems don’t generally reduce the amount of raw data storage you require. Indeed, in many cases edge systems need more total storage space than their “traditional” counterparts, if only because some duplication of data is inevitable in distributed networks.
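A back-of-the-envelope calculation makes the point. Every figure below is an illustrative assumption (device count, reading size, retention windows), but it shows how an edge buffer adds to, rather than replaces, the capacity you need in the central cloud.

    # Back-of-the-envelope capacity estimate. Every figure below is an
    # illustrative assumption, not a measurement.
    devices = 10_000                # IoT devices in the network
    bytes_per_reading = 200         # average serialized reading size
    readings_per_day = 24 * 60      # one reading per minute per device

    daily_ingest = devices * bytes_per_reading * readings_per_day  # bytes/day

    edge_retention_days = 7         # local buffer kept at each edge site
    cloud_retention_days = 365      # long-term copy in central cloud storage

    edge_total = daily_ingest * edge_retention_days
    cloud_total = daily_ingest * cloud_retention_days

    gib = 1024 ** 3
    print(f"daily ingest:  {daily_ingest / gib:.1f} GiB")
    print(f"edge buffer:   {edge_total / gib:.1f} GiB of extra, duplicated storage")
    print(f"cloud archive: {cloud_total / gib:.1f} GiB")
    print(f"combined:      {(edge_total + cloud_total) / gib:.1f} GiB")

Even a modest seven-day edge buffer adds a couple of percent on top of a year-long cloud archive under these assumptions, and longer edge buffers or shorter cloud retention push that overhead higher.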

That said, edge infrastructure can help you scale in other ways. It can allow you, for instance, to rapidly increase the size and capabilities of your IoT networks. Similarly, moving to an edge model can help you plan your data storage needs during digital transformation, because it gives you closer control over the way data flows through your systems.

There are, of course, a number of other considerations when moving to this model. You will need to think carefully about the cost of the move, and about how you will ensure security and compliance across distributed storage networks. But, done correctly, edge storage and computing could save your organization a lot of time, and ultimately a lot of money.


Nahla Davies
Nahla Davies is a software developer and writer. Before devoting her work full time to technical writing, she managed—among other intriguing things—to serve as a lead programmer at an Inc. 5,000 experiential branding organization whose clients include Samsung, Time Warner, Netflix, and Sony.
