Tim Mackey, Senior Technical Evangelist, Black Duck Software.

Implementing a production container strategy requires an understanding of the risks inherent in the construction, deployment and scaling of containerized solutions.

The popularity of Docker has created an environment fostering explosive growth in containerized applications, but with explosive growth comes an increase in the potential for security issues.

Trusting Container Images

Working with curated Docker images reduces the risk associated with production container usage and provides a base for enterprise application developers to innovate. Without supporting systems in place to vet the contents of container images, a compromised or malicious image could jeopardize your container deployment.

Limiting the Attack Surface

One easy method to reduce risk in production container deployments is to reduce the attack surface. The container attack surface consists of the Linux host where the containers execute, the contents of the Docker Image used as the building block for the container, and the application itself.

Reducing the attack surface of the Linux host is fairly straightforward. While a general purpose Linux distribution such as Red Hat Enterprise Linux can easily run containers, it also contains a large number of components which aren’t required for container deployments. In order to address this, minimal Linux distributions have been created explicitly to support production container usage. For example, Red Hat Atomic Enterprise Platform comes preconfigured for Docker, and includes security features such as SELinux enabled by default.
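
As a quick illustration, the short Python sketch below checks that SELinux is enforcing and reports the Docker server version on a candidate host before containers are scheduled onto it. It assumes the getenforce utility and the Docker CLI are available on the host's PATH, which is typical of RHEL-family systems but may differ on other distributions.

    # Minimal sketch: confirm the container host has SELinux enforcing and a working
    # Docker engine before any containers are scheduled onto it. Assumes the
    # 'getenforce' and 'docker' binaries are on the host's PATH (typical of
    # RHEL-family hosts); adjust for your distribution.
    import subprocess

    def selinux_enforcing():
        """Return True if SELinux reports 'Enforcing'."""
        try:
            out = subprocess.check_output(["getenforce"], text=True).strip()
        except (OSError, subprocess.CalledProcessError):
            return False
        return out == "Enforcing"

    def docker_server_version():
        """Return the Docker server version string, or None if unavailable."""
        try:
            return subprocess.check_output(
                ["docker", "version", "--format", "{{.Server.Version}}"],
                text=True).strip()
        except (OSError, subprocess.CalledProcessError):
            return None

    if __name__ == "__main__":
        print("SELinux enforcing:", selinux_enforcing())
        print("Docker server version:", docker_server_version())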

Validating Image Contents

Understanding the contents of the Docker Image on an ongoing basis is key to minimizing the risks associated with production container usage.

In a standalone Linux environment, tools like OpenSCAP and OVAL data streams provide a means to validate compliance for the Linux host. Using this data with Docker containers can be challenging, but if the minimal Linux host is the Red Hat Atomic Enterprise Platform, an integrated OpenSCAP solution is available.

A default Atomic scan will validate the compliance state of the curated container environment and the presence of any security issues related to curated binaries in the container. This default scan provides assurance that the base image is current on patches. What the default scan doesn't do, however, is validate the state of any third-party binaries or custom applications, which requires a broader data source than a curated registry typically provides.
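
For teams automating that validation step, the sketch below shows one way such a scan could be driven from Python. It assumes a Red Hat Atomic host with the atomic CLI installed and that atomic scan <image> is the invocation in use; option names and exit-code behaviour vary between releases, so treat this as a starting point rather than a definitive implementation.

    # Minimal sketch: drive 'atomic scan' against a list of images from a script.
    # Assumes a Red Hat Atomic host with the 'atomic' CLI installed; it treats a
    # non-zero exit status as "review the output", but verify that behaviour
    # against the release you run, since options and exit codes vary.
    import subprocess

    IMAGES = ["rhel7:latest", "myapp:1.0"]  # hypothetical image names

    def scan_image(image):
        """Run the default atomic scan and return (clean, combined_output)."""
        result = subprocess.run(["atomic", "scan", image],
                                capture_output=True, text=True)
        return result.returncode == 0, result.stdout + result.stderr

    if __name__ == "__main__":
        for image in IMAGES:
            clean, output = scan_image(image)
            print(f"{image}: {'clean' if clean else 'NEEDS REVIEW'}")
            if not clean:
                print(output)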

Implementing a Risk-Based Container Management Solution

Defining a proactive risk management policy spanning the development phase through deployment is a key component in minimizing the potential for security issues to become deployed problems in containers. Here are four recommended steps:

1) Utilize a Minimal Linux Host

Utilizing a minimal Linux host with strong security principles is crucial. Look for a Linux host environment that includes a curated Docker configuration with predefined security templates enabled.

2) Define Compliance Policies

Define a set of compliance policies covering both the Linux host and the containers deployed on it. These compliance policies should address both industry compliance concerns, such as HIPAA or PCI-DSS, and application policies relating to container composition.
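
Policies such as these are easiest to enforce when they are captured in machine-readable form. The sketch below is a hypothetical example of expressing a container policy as data and evaluating image metadata against it; the policy fields, severity ordering, and image_facts structure are illustrative assumptions, not a schema from any particular product.

    # Minimal sketch: express a container compliance policy as data and evaluate
    # image metadata against it. The policy fields and the image metadata shown
    # here are hypothetical examples, not a standard schema.

    POLICY = {
        "allowed_base_images": ["registry.access.redhat.com/rhel7"],
        "max_cve_severity": "medium",   # block anything with high/critical findings
        "required_labels": ["maintainer", "build-date"],
    }

    SEVERITY_ORDER = ["none", "low", "medium", "high", "critical"]

    def evaluate(image_facts, policy=POLICY):
        """Return a list of policy violations for one image's metadata."""
        violations = []
        if image_facts["base_image"] not in policy["allowed_base_images"]:
            violations.append(f"base image {image_facts['base_image']} is not curated")
        worst = image_facts.get("worst_cve_severity", "none")
        if SEVERITY_ORDER.index(worst) > SEVERITY_ORDER.index(policy["max_cve_severity"]):
            violations.append(f"contains {worst}-severity vulnerabilities")
        for label in policy["required_labels"]:
            if label not in image_facts.get("labels", {}):
                violations.append(f"missing required label '{label}'")
        return violations

    if __name__ == "__main__":
        facts = {  # example metadata as a scanner might report it
            "base_image": "registry.access.redhat.com/rhel7",
            "worst_cve_severity": "high",
            "labels": {"maintainer": "appteam"},
        }
        for problem in evaluate(facts):
            print("VIOLATION:", problem)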

3) Use Curated Docker Images

Docker images from a curated source ensure not only that core security issues are addressed in a timely fashion, but also that image content can be trusted.
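
One concrete control supporting trusted image content is Docker Content Trust, which requires images to be signed before a pull succeeds. The sketch below assumes the Docker CLI is available and enables content trust through the DOCKER_CONTENT_TRUST environment variable; the image name is only an example.

    # Minimal sketch: pull an image with Docker Content Trust enabled so that
    # unsigned or tampered images are rejected at pull time. Assumes the Docker
    # CLI is installed; the image name below is only an example.
    import os
    import subprocess

    def trusted_pull(image):
        """Pull an image with content trust enforced; return True on success."""
        env = dict(os.environ, DOCKER_CONTENT_TRUST="1")
        return subprocess.run(["docker", "pull", image], env=env).returncode == 0

    if __name__ == "__main__":
        if trusted_pull("docker.io/library/nginx:latest"):
            print("signed image pulled")
        else:
            print("pull refused or failed: check that the image is signed")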

4) Proactively Scan Docker Images for Vulnerabilities and Dependency Risks

Using a Docker image from a curated source minimizes risk, but if open source components are included in the image, a specialized risk management system providing visibility into those components is needed.

Identifying open source components and versions is essential to knowing whether your container includes code with a known vulnerability. Black Duck has performed thousands of software audits to identify open source code and the associated licensing or security risks. We inevitably find problematic open source components and known open source security vulnerabilities. In a recent example, Black Duck On-Demand audits of 200 commercial applications found that over 67% contained open source vulnerabilities.

With over 6,000 open source vulnerabilities discovered since the beginning of 2014, organizations need to know what open source code is inside their containers – and ensure it’s up-to-date and secure.
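
As a simplified illustration of that identification step, the sketch below runs a throw-away container to inventory its installed RPM packages and compares the results against a small, hard-coded list of known-vulnerable versions. The image name and the vulnerability entries are hypothetical; a production solution would draw on a continuously updated vulnerability knowledge base rather than a static list.

    # Minimal sketch: inventory the RPM packages inside an image and flag any that
    # match a hard-coded, hypothetical list of known-vulnerable versions. Assumes
    # the Docker CLI is installed and the image is RPM-based; a real deployment
    # should use a dedicated vulnerability knowledge base rather than this list.
    import subprocess

    KNOWN_VULNERABLE = {  # illustrative entries only
        ("openssl", "1.0.1e-16.el6"): "CVE-2014-0160 (Heartbleed)",
        ("bash", "4.1.2-15.el6"): "CVE-2014-6271 (Shellshock)",
    }

    def list_packages(image):
        """Return (name, version-release) tuples for the packages in the image."""
        out = subprocess.check_output(
            ["docker", "run", "--rm", image,
             "rpm", "-qa", "--qf", "%{NAME} %{VERSION}-%{RELEASE}\n"],
            text=True)
        return [tuple(line.split()) for line in out.splitlines() if line.strip()]

    if __name__ == "__main__":
        for name, version in list_packages("rhel7:latest"):  # example image name
            finding = KNOWN_VULNERABLE.get((name, version))
            if finding:
                print(f"{name}-{version}: {finding}")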

Container technologies like Docker promise to transform the way enterprises develop, deploy, and manage critical applications. But the same challenges around deploying and managing applications also affect container deployments. Visibility, identification, and tracking, coupled with policy management and orchestration technologies, can provide a solid foundation for securely managing your container infrastructure.