Container technology is in a period of explosive growth, with usage nearly doubling in recent months and vendors increasingly adding it to their portfolios.

Docker, which has popularised the technology, reported that it has now served two billion ‘pulls’ of container images, up from 1.2 billion in November 2015.

But containerisation technology isn’t new. Back in 1979, Unix V7 introduced the chroot system call, which provided a basic form of filesystem-level process isolation, but it took until 2000 for early container technology proper to emerge, when FreeBSD introduced jails, developed by Poul-Henning Kamp.
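To give a flavour of how limited that first primitive was, here is a minimal sketch in Go (the jail path is a hypothetical placeholder, and the call requires root privileges): chroot confines a process’s view of the filesystem, and nothing more.

```go
package main

import (
	"fmt"
	"os"
	"syscall"
)

func main() {
	// Path to a prepared minimal root filesystem (hypothetical example path).
	jail := "/srv/jail"

	// chroot confines this process's view of the filesystem to the jail
	// directory; on its own it does not isolate CPU, memory, users or
	// networking -- which is why later container technologies built on it.
	if err := syscall.Chroot(jail); err != nil {
		fmt.Fprintln(os.Stderr, "chroot failed (are you root?):", err)
		os.Exit(1)
	}
	// After chroot, "/" refers to the jail directory.
	if err := os.Chdir("/"); err != nil {
		fmt.Fprintln(os.Stderr, "chdir failed:", err)
		os.Exit(1)
	}
	fmt.Println("now confined to", jail)
}
```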

Google is perhaps the biggest-name user of container-style technology; it developed Process Containers (later renamed cgroups) in 2006. Google’s creation of Borg, a large-scale cluster management software system, is credited as one of the key factors behind the company’s rapid evolution.

Borg works by parcelling work out across Google’s fleet of servers, providing a central brain for controlling tasks across the company’s data centres. As a result, the company hasn’t had to build separate clusters of servers for each software system, such as Google Search, Gmail and Google Maps.

The work is divided into smaller tasks, which Borg dispatches whenever it finds free computing resources.
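To illustrate the principle rather than Borg’s actual algorithm (which is far more sophisticated), a scheduler of this kind can be sketched in Go as a greedy loop that places each task on the first machine with enough free capacity; all the names and capacity figures below are invented:

```go
package main

import "fmt"

// Task and Machine are simplified stand-ins for the real scheduling
// entities; a production scheduler tracks many more dimensions
// (CPU, RAM, priorities, placement constraints).
type Task struct {
	Name string
	CPU  int // CPU units the task needs
}

type Machine struct {
	Name    string
	FreeCPU int // CPU units currently unused
}

// assign greedily places each task on the first machine with enough
// free capacity -- the essence of "dispatching tasks whenever free
// computing resources are found".
func assign(tasks []Task, machines []Machine) map[string]string {
	placement := make(map[string]string)
	for _, t := range tasks {
		for i := range machines {
			if machines[i].FreeCPU >= t.CPU {
				machines[i].FreeCPU -= t.CPU
				placement[t.Name] = machines[i].Name
				break
			}
		}
	}
	return placement
}

func main() {
	tasks := []Task{{"search-frontend", 4}, {"mail-indexer", 2}, {"maps-tiles", 3}}
	machines := []Machine{{"rack1-host1", 5}, {"rack1-host2", 6}}
	fmt.Println(assign(tasks, machines))
}
```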

At their most basic, this is essentially what containers do: they offer independent, deployable units of code from which applications can be built.

Container technology has existed on Unix and Linux for a number of years, but its recent growth can really be attributed to Docker, whose partners include the likes of IBM, Amazon Web Services and Microsoft.

This isn’t to say that Docker is the only company playing in the field, nor that it has perfected the technology.

As mentioned earlier, Google is a major player in the technology; it has contributed to numerous container-related open source projects, such as Kubernetes. Microsoft is another big player, having added container support with Windows Server Containers and the Azure Container Service for its cloud platform.

Off the back of the popularity of container deployments on its EC2 platform, AWS has developed a cluster management and scheduling engine for Docker called the EC2 Container Service.

Despite these deployments, there are areas of the technology that vendors have been looking to improve.

Security is one such area.

Gunnar Hellekson, director of product management for Red Hat Enterprise Linux and Enterprise Virtualization, and Josh Bressers, senior product manager for security, wrote in a blog post about how containers are being fixed in the wake of the glibc vulnerability.

The flaw means that attackers could remotely crash machines or force them to execute malicious code, all without the end user’s knowledge.

The blog post points out that container scanners are just a "paper tiger": they may help find a flaw, but they don’t really help to fix it.

Red Hat says: "We provide a certified container registry, the tools for container scanning, and an enterprise-grade, supported platform with security features for container deployments."

This has been a common message in my reading about container deployments: when vendors integrate with Docker and other container technologies, there is typically a focus on improving security.

Although there are security improvements to be made, containers hold significant promise for the enterprise. Here are three examples of where containers can help.


Containers can help with the rise of bring your own device (BYOD)

The problem is that as BYOD grows, it becomes more important to separate work from play. This can be done through a secure container that provides authenticated, encrypted areas on a user’s mobile device.

The point of doing this is that the personal side of a mobile device can be insulated from sensitive corporate information.
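As a rough sketch of the principle rather than any vendor’s implementation, corporate data inside such a container might be kept encrypted at rest with an authenticated cipher. In the Go sketch below the key handling is deliberately simplified; a real product would derive and protect the key via the device’s hardware-backed secure storage after the user authenticates:

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
	"io"
)

// sealRecord encrypts corporate data with AES-GCM, an authenticated
// cipher, so tampering is detected as well as contents being hidden.
func sealRecord(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key) // key must be 16, 24 or 32 bytes
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return nil, err
	}
	// Prepend the nonce so the sealed record is self-contained.
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

func main() {
	// Placeholder key: a real secure container would fetch this from
	// hardware-backed storage, not generate it ad hoc like this.
	key := make([]byte, 32)
	if _, err := io.ReadFull(rand.Reader, key); err != nil {
		panic(err)
	}
	sealed, err := sealRecord(key, []byte("quarterly sales figures"))
	if err != nil {
		panic(err)
	}
	fmt.Printf("sealed record: %x\n", sealed)
}
```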

More distributed apps

Docker can speed up the application development lifecycle by making distributed apps portable and more dynamic; these distributed applications are composed of multiple functional, interoperable components.

Basically, this is the next generation of application architectures and processes, designed to support business innovation.
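As a minimal illustration of what one such component might look like (the service name and port below are invented), each piece of a distributed app can be a small, self-contained service that speaks HTTP, is packaged into its own container image, and is composed with other components over the network:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// A tiny, self-contained service: one functional component of a larger
// distributed app. Packaged into its own container image, it can be
// deployed, scaled and replaced independently of the components that
// call it over the network.
func main() {
	http.HandleFunc("/greet", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "hello from the greeting component")
	})
	// Other components would reach this one by service name, e.g.
	// http://greeter:8080/greet (the name "greeter" is a placeholder).
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```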


More innovation

Continuing on the innovation front: because the developer no longer has to focus on the labour-intensive work of tying the application to hardware and software variables, they are free to develop new things.

This is similar to one of the benefits of using a cloud vendor for Infrastructure-as-a-Service: rather than dealing with time-consuming tasks that don’t add a great deal of value, the developer can focus elsewhere.


The top three Docker application areas so far have been test and quality assurance, web-facing apps and big data enterprise apps.

Its rising popularity can be linked to the rise of cloud, partly because web-scale players have used the containerisation technology to such good effect.

However, unlike cloud, Docker adoption has been driven primarily from the middle out: nearly 47% of Docker decisions have been made by middle management and 24% by grassroots efforts. Cloud, meanwhile, started primarily as a top-down decision pushed by CIOs.

Given its growing popularity and the benefits it can deliver to organisations and developers, the technology is likely to continue its rise to the top, particularly when the number of big-name vendors working on perfecting it is taken into account.