IT has gone through a series of major shifts over the years. The move from physical infrastructure to virtual machines was a huge change in operational patterns and architecture. But virtualization only took IT so far: virtualizing physical components didn’t, by itself, bring about cloud-native architectures. That’s where containerization comes in.
Containers are standardized units of software that package up code along with its underlying dependencies. This allows an application to run quickly and reliably and to move easily from one computing environment to another.
Docker container images, for example, are lightweight, stand-alone, and executable software packages. They contain code, runtime, system tools, system libraries, settings, and anything else required to run an application. Thus, they have quickly become invaluable elements of the developer toolkit.
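As an illustration, a minimal Dockerfile shows how the code, runtime, dependencies, and settings all get declared in a single image (a hypothetical Python service is assumed here; the file names are placeholders):

```dockerfile
# Start from a slim base image that supplies the language runtime
FROM python:3.12-slim

WORKDIR /app

# Bake the dependency list and application code into the image
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Settings travel with the image as well
ENV APP_ENV=production

CMD ["python", "app.py"]
```

Because everything the application needs is captured in the image, the same artifact can be run on a laptop, a test server, or a cloud cluster.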
Why? Containers isolate software from the underlying environment, and that isolation also improves security. Although security weaknesses can crop up in containers, applications inside them generally benefit from strong default isolation.
See below for the top trends in the containerization market:
Decoupling and Portability
In many ways, containers are similar to VMs. A big difference, however, is their relaxed isolation properties, which allow the operating system to be shared among several applications. Because containers are decoupled from the underlying infrastructure, they are portable across clouds and operating systems and well suited to cloud-native development.
As a result, containers let developers bypass much of the labor involved in building applications: they can take advantage of pre-packaged components and write only the code specific to the application. Software then becomes easier to develop, easier to test, and easier to deploy.
DevOps Needs Containers, Not VMs
The DevOps approach goes hand in hand with containers. According to Red Hat, many organizations have become enamored with the iterative methodology of DevOps. It is now used in a great many enterprises, although many have struggled to fully realize its potential. One sticking point that often comes up is provisioning. An Enterprise Strategy Group (ESG) study found that 67% of IT professionals admitted to being under pressure to accelerate infrastructure provisioning to support developers and line-of-business teams.
One of the points of difficulty with DevOps is that developers tend to fall back on the familiar – running virtual machines (VMs) in the cloud. And that tends to inhibit DevOps progress. Containers provide a more automated platform on which to establish DevOps and take advantage of microservices.
To be successful, DevOps depends upon the ability to automate routine tasks and build standardized environments. Bottom line: DevOps transformation is enhanced by the adoption of a containerized environment running on a cloud-native platform. Hence, the trend of DevOps success being driven by container adoption.
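The automation and standardization that DevOps depends on can be sketched in a container-based CI pipeline. This is a hypothetical example assuming GitHub Actions as the CI system; the image name and test command are assumptions, not a specific vendor recommendation:

```yaml
# Hypothetical CI workflow: every push builds and tests the same container
# image that will later be deployed, giving a standardized environment
# from development through production.
name: build-container
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build the container image
        run: docker build -t example-app:${{ github.sha }} .
      - name: Run the test suite inside the container
        run: docker run --rm example-app:${{ github.sha }} pytest
```

The point is that the pipeline exercises the exact artifact that ships, rather than a developer's local VM environment.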
Follow the Hyperscalers
According to IT research firm Omdia, containers are in heavy use among cloud service providers, especially large ones such as Google, Amazon, and Microsoft: on average, said Omdia, 45% of software containers used in Infrastructure as a Service (IaaS) environments are located in data centers owned by cloud providers.
While cloud providers have long touted IaaS and Platform as a Service (PaaS) as the way to go, adoption was relatively slow until recently. Containers changed that trend and are now one of the big drivers of IaaS and PaaS uptake.
Deep Learning Workloads
A big trend is the management of artificial intelligence (AI) workloads in containers. Container technologies give organizations isolation, portability, scalability, and dynamic behavior, so AI infrastructure management becomes more automated, easier, and more business-friendly than before, according to Bin Fan, Vice President of Open Source and Founding Engineer at Alluxio.
“Deep learning workloads are increasingly containerized, further supporting autonomous operations,” said Fan. “To keep up with this trend, organizations can find their AI workloads running on more flexible cloud environments in conjunction with Kubernetes.”
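A containerized deep learning training run on Kubernetes might be expressed as a Job that requests GPU resources. This is a minimal sketch; the image name is hypothetical, and the `nvidia.com/gpu` resource assumes the cluster has the NVIDIA device plugin installed:

```yaml
# Hypothetical Kubernetes Job for a containerized training workload.
apiVersion: batch/v1
kind: Job
metadata:
  name: train-model
spec:
  template:
    spec:
      containers:
        - name: trainer
          image: example.com/dl/trainer:latest  # assumed image name
          resources:
            limits:
              nvidia.com/gpu: 1  # schedule onto a GPU-equipped node
      restartPolicy: Never
```

Kubernetes handles placement and restarts, which is part of what makes containerized AI infrastructure more automated than hand-managed VMs.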
Application and platform developers, then, have been steadily moving away from the traditional practice of carving out databases and other repositories in which to offload data. Instead, much more can be moved into a containerization platform to simplify design and add functionality.
“Containerization will be used for more stateful workloads, especially for databases and things that really require storage onto Kubernetes and onto containers,” said Al Brown, CTO, Veritone.
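Stateful workloads like the databases Brown describes are typically run on Kubernetes as a StatefulSet, which gives each pod a stable identity and its own persistent volume. A minimal sketch, using Postgres as an example (the storage size and labels are assumptions):

```yaml
# Hypothetical StatefulSet: each Postgres pod keeps a stable name and a
# dedicated persistent volume, which stateful workloads require.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: postgres
spec:
  serviceName: postgres
  replicas: 1
  selector:
    matchLabels:
      app: postgres
  template:
    metadata:
      labels:
        app: postgres
    spec:
      containers:
        - name: postgres
          image: postgres:16
          volumeMounts:
            - name: data
              mountPath: /var/lib/postgresql/data
  volumeClaimTemplates:
    - metadata:
        name: data
      spec:
        accessModes: ["ReadWriteOnce"]
        resources:
          requests:
            storage: 10Gi
```

The `volumeClaimTemplates` section is what distinguishes this from a stateless Deployment: storage survives pod restarts and follows the pod's identity.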
As part of this trend, developers are moving away from cloud-specific services like Amazon Elastic Container Service (ECS) and the AWS Fargate serverless compute platform toward Kubernetes-based services such as OpenShift, which offer more resource efficiency.
“This is moving away from cloud-specific versions of the innovation to more standard versions of Kubernetes as well as OpenShift,” said Brown.