What Are Containers?

A container packages an application with all the code and dependencies it needs to run in an isolated environment. Multiple containers can share an operating system kernel, yet they run independently of one another.

Some basic container terms include:

  • A container engine, the software that runs containers on the server’s operating system
  • A container image, a static file that becomes a running container once the container engine executes it
  • Containerization, the practice of building and running applications in containers on computers or servers

The container image holds the application together with its dependencies: any libraries and other code it requires to run. Multiple containers can run on the same machine and operating system at one time.
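As a concrete illustration, a container image is often defined in a build file such as a Dockerfile. The sketch below assumes a hypothetical Python application; the base image, file names, and start command are illustrative, not prescriptive:

```dockerfile
# Hypothetical image definition for a small Python app
FROM python:3.11-slim          # base image supplying the runtime and OS userland
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # bake the dependencies into the image
COPY . .
CMD ["python", "app.py"]       # command the container runs when started
```

Building this file produces an image, and the container engine turns that image into a running container (for Docker: `docker build -t myapp .` followed by `docker run myapp`).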

Containers are helpful in data centers because they:

  • Are lightweight, allowing applications to run in an isolated environment and move easily between hardware
  • Are highly portable, which simplifies workload management and broadens the hardware an application can run on. Data center workloads must be increasingly flexible and able to move to different servers when needed, which containers enable.
  • Start quickly, rather than taking a long time to boot

Differences and similarities between containerization and virtualization

Containerization and virtualization, two foundational data center technologies, accomplish similar tasks but abstract different layers of the stack. Virtual machines reduce the amount of hardware needed to run software by allowing multiple operating systems to share one machine: a hypervisor sits atop the server, and each virtual machine runs atop the hypervisor.

Virtualization makes provisioning new capacity easier, since spinning up a virtual machine does not require setting up new hardware. A typical virtual machine image is measured in gigabytes (GB).

Containers do not run on hypervisors. Instead, they require a container engine. Containers abstract applications from the servers on which they run. This allows containers to run on a greater variety of machines within a data center. Containers are portable and flexible because they transport applications, along with their dependencies and code, to whatever server a workload is best suited for at the time.

Containers are also more lightweight than virtual machines. Their average size is measured in megabytes (MB) rather than GB. Servers can typically hold more containers than virtual machines.

Containers also launch faster than virtual machines, an advantage for speed-conscious enterprises.

The Open Container Initiative (OCI)

The Open Container Initiative, known as the OCI, publishes open specifications for container image formats and runtimes, with the goal of keeping containers widely usable and interoperable. The rising popularity of containers calls for shared guidelines so that they remain portable across data centers and IT environments.

If containers follow common specifications, data center silos are reduced and companies can rely on interoperable technology. These specifications apply to all container images, not just one brand or type. Shared standards also help prevent fragmentation and extend the usefulness of containers.

Container technologies: Docker and Kubernetes

Docker, launched in 2013, is a container engine whose image format conforms to the OCI specifications. Docker can run containerized applications on bare metal servers as well as on virtual machines, and it has become the de facto industry standard for containerization.

Kubernetes is an open source container orchestration platform originally developed by Google. A Kubernetes cluster consists of worker nodes, which run pods (groups of one or more containers), and a control plane that manages all the containers. Kubernetes can run in a variety of environments, including different clouds and bare metal servers.
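As a sketch of how the control plane manages containers, a minimal Kubernetes pod manifest looks like the following; the pod name and image are hypothetical examples:

```yaml
# Hypothetical minimal pod: one container scheduled by the control plane onto a node
apiVersion: v1
kind: Pod
metadata:
  name: demo-app
spec:
  containers:
    - name: web
      image: nginx:1.25        # OCI image pulled by the node's container runtime
      ports:
        - containerPort: 80
```

Applying this manifest (for example with `kubectl apply -f pod.yaml`) asks the control plane to schedule the pod onto an available worker node.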

Benefits of Containers for Storage

Enterprises need high-performing, easily portable applications for their most important workloads, but to really succeed, those applications also need access to the enterprise’s stores of data. Containers bring portability and flexibility to data centers, but they must also be able to draw on stored data.

This can be accomplished by designing a software-defined environment for storage and creating a software-managed storage pool. If the software-defined storage platform can also run a container engine, containerized applications will be able to use the enterprise data in the storage pool.

IBM FlashSystem, for example, makes hot data in storage available to container workloads. The Kubernetes Container Storage Interface (CSI) lets containerized applications provision and use persistent storage volumes. As containerization grows more popular, top data center players are recognizing the importance of integrating container use with data storage.
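With CSI, containerized applications typically request storage through a PersistentVolumeClaim, which a CSI driver fulfills from the underlying storage pool. In the sketch below, the claim name, storage class, and size are all hypothetical:

```yaml
# Hypothetical claim for 10 GiB of persistent storage from a CSI-backed storage class
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 10Gi
  storageClassName: csi-block   # assumed storage class served by a CSI driver
```

A pod then mounts this claim as a volume, giving the containerized application access to data in the software-managed storage pool.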

Jenna Phipps
Jenna Phipps is a contributor for Enterprise Mobile Today, Webopedia.com, and Enterprise Storage Forum. She writes about information technology security, networking, and data storage. Jenna lives in Nashville, TN.
