Containers: A Primer for Enterprise IT Pros

Stephen Foskett provides an introduction to application containerization at Interop ITX.

Marcia Savage

July 26, 2017


The buzz around containers, particularly the Docker container platform, is hard to avoid. Containerization of applications promises speed and agility, capabilities that are essential in today's fast-paced IT environment. But outside the world of DevOps, containers can still be an unfamiliar technology.

At Interop ITX, Stephen Foskett, organizer of Tech Field Day and proprietor of Gestalt IT, provided some clarity about application containerization. In a presentation titled "The Case For Containers," he explained the basics of the technology and what enterprise IT shops can expect from it.

First off, container technology isn't anything new, he said. "The reason we're hearing about it is Docker. They've done a nice job of productizing it."

He explained that containers are similar to virtual machines "except for this whole idea of user space." A container, which uses operating system-level virtualization, has strict boundaries around a limited set of libraries and is custom-designed to run a specific application. That focus on one application is a key differentiator from virtual machines and makes containers important for enterprise IT, he said.
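
As a small illustration of that isolation (an example, not from the presentation itself), a process listing inside a fresh container sees only the container's own processes, even though the host's kernel is shared:

    $ docker run --rm alpine ps aux    # lists only PID 1: the ps command itself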

Docker, which launched as an open source project in 2013, "got a lot of things right," Foskett said. For example, Docker Hub makes it easy to locate images, which become containers when users instantiate them. Docker also uses layered storage, which conserves space. At the same time, though, that layered approach can lead to performance issues, he added.
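
As a quick sketch of how that works in practice, the layers are visible from the command line:

    $ docker search nginx     # find images published on Docker Hub
    $ docker pull nginx       # only layers not already on disk are downloaded
    $ docker history nginx    # each row is one read-only layer of the image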

Cattle or pets?

Since cloud technologies began altering the IT landscape, cattle vs. pets has become a common meme. "Many in DevOps will tell you they're [containers] a cattle approach, but they're not really cattle; they're pets," Foskett said.

While containers can be spun up and torn down quickly, the problem is that by default, Docker doesn't actually destroy the container, which can lead to container sprawl. "When you exit a container, the container stays there with the data as you left it," unless manually deleted with the docker rm command, Foskett said.

"If you run a container and stop it, and the image stays around, someone can easily restart the container and access what you were doing," he said. "That's probably not a problem on your test laptop, but you can't do that if you're engineering a system."

Another sticky issue for enterprises: It can be difficult to know the origin of images on Docker Hub. "You can't guarantee it's something good," Foskett said. "Many enterprises aren't too thrilled with this concept."
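
Two mitigations were available at the time, offered here as a sketch rather than as Foskett's advice: turn on Docker Content Trust so the client refuses unsigned images, or pin an image to an immutable digest (the digest below is a placeholder):

    $ export DOCKER_CONTENT_TRUST=1        # pull only signed, verified images
    $ docker pull ubuntu:16.04
    $ docker pull ubuntu@sha256:<digest>   # an exact, content-addressed version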

He advised practicing good hygiene when using containers by keeping images simple and using external volume storage to reduce the risk of data exposure. "Then the container itself stays pristine; you don't have data building up in it."
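
In practice that means mounting a volume for anything the application writes; the volume and image names here are illustrative:

    $ docker volume create appdata                   # storage managed outside the container
    $ docker run -v appdata:/var/lib/app myorg/app   # writes land in the volume, so the
                                                     # container's own layers stay clean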

Container benefits

One of the main reasons he's excited about containers, as a system administrator, is that they allow users to specify the entire application environment, Foskett said. A consistent application environment means not having to worry about OS levels, patches, or incompatible applications and utilities.

"This is the critical reason containers are going to be relevant in the enterprise data center," he said.

Another container benefit is security, Foskett said. Security breaches often stem from privilege escalation through utilities and application components, which can compromise an entire system. Containerized applications don't carry unused utilities, so there's less exposure to attack.
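
A quick way to see that smaller attack surface, again as an illustration rather than something from the session: a minimal base image simply lacks the tools an intruder would lean on:

    $ docker run --rm alpine which bash    # prints nothing: no bash in the image
    $ docker run --rm alpine which curl    # no curl either, so less to escalate through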

Foskett said containers also enable scalable application platforms using microservices. Instead of monolithic systems that are hard to scale, enterprises can have containerized applications for specific functions.
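
A Docker Compose file sketches the idea, with one container per function; the service and image names here are hypothetical:

    # docker-compose.yml
    version: '2'
    services:
      web:
        image: myorg/web:1.0
        ports:
          - "80:80"
      api:
        image: myorg/api:1.0
      cache:
        image: redis:3.2

Running docker-compose scale api=4 would then add API containers without touching the web or cache tiers.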

Getting started

Foskett advised attendees to start experimenting with Docker and Windows containers. "One of the coolest things about Docker is that it's really easy to try," he said.
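
The canonical first experiment is a single command:

    $ docker run hello-world    # pulls a tiny test image and prints a success message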

A Docker Enterprise Edition is in the works that will include certified containers and plugins. When you download a certified image from Docker Hub, "you know it's really going to be what it says it is," he said.

Docker Inc., the company that manages the Docker open source project and the ecosystem around it, has traditionally focused on developers, but has shifted to an enterprise mindset, Foskett said. "They're addressing concerns we have."

While real microservices won't happen for another five to ten years, "the future really is containerized," Foskett told attendees. "This isn't just a fad or a trend, but an important movement in IT that has important benefits to people like you and me."

About the Author

Marcia Savage

Executive Editor, Network Computing
