What Is Docker? A Revolutionary Change in Cloud Computing

The world of cloud computing and enterprise software experiences an endless parade of new technologies, languages, and platforms. Many of them fall by the wayside after their fifteen minutes of fame is up.

Docker is different. Originally a proprietary project started by Solomon Hykes at dotCloud, it was open-sourced in early 2013. Its popularity grew steadily and then exploded in 2014. By the middle of 2015, more than 300 million container downloads had occurred (according to Docker's home page). The project has been "starred" more than 25,000 times on GitHub, forked more than 6,500 times, and has received some 1,000 contributions.

Container technology has been around since the days of the mainframe, but it has resurged over the last couple of years thanks to the growth of virtualization, the declining emphasis on any particular operating system in the cloud, and the firm establishment of Linux.

What Is Docker?

Docker is an open-source system of software containers. Containers let software run reliably as it moves from one environment to another (say, from a developer's laptop to staging to production) because everything the application needs to run is packaged inside the container itself. The code, runtime, system tools, libraries, and other dependencies all live in one isolated environment. Because everything is self-contained, programmers do not have to worry about which flavor of Linux is running wherever the application happens to be deployed. Simply put, it works everywhere.
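
As an illustration, the sketch below shows what such a self-contained package might look like as a Dockerfile. The base image, file names, and port are hypothetical; the point is that every dependency the application needs is declared right alongside the code.

    # A hypothetical Dockerfile for a small Python web application.
    # The base image, file names, and port are illustrative only.
    FROM python:2.7
    WORKDIR /app

    # Install the application's dependencies inside the image
    COPY requirements.txt .
    RUN pip install -r requirements.txt

    # Copy the application code itself
    COPY . .

    # Document the port the app listens on and define how the container starts
    EXPOSE 8000
    CMD ["python", "app.py"]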

This is different from using a virtual machine (VM), which emulates an entire machine, guest operating system included. Docker virtualizes at the operating-system level: the application and every associated component are packaged together while the host's kernel is shared. This makes each container:

  • Highly portable and able to run anywhere
  • Lightweight and highly scalable
  • Cost-efficient, because high container density allows more workloads to run on a single machine
  • Easy to deploy in the cloud and on-premises
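
To make the portability point concrete, here is a rough sketch of the usual workflow, assuming the Dockerfile above and a hypothetical image name: the image is built once, pushed to a registry, and the identical artifact then runs on any host with a Docker daemon.

    # Build the image once, on any machine with Docker installed
    docker build -t myorg/myapp:1.0 .

    # Push it to a registry (the image and registry names are hypothetical)
    docker push myorg/myapp:1.0

    # Run the very same image on a laptop, a staging server, or a production host
    docker run -d -p 8000:8000 myorg/myapp:1.0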

Shared Operating Systems

Hypervisors, or virtual machine managers, emulate entire hardware systems, and each VM carries its own operating system and the resource overhead that comes with it. Containers put far less load on system resources because they share the host's operating system. Docker also shortens the development life cycle, because programmers can build applications in many different languages on a wide variety of stacks.
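
One way to see this sharing in practice, assuming a Linux host with Docker installed, is to compare the kernel version reported on the host with the one reported inside a container. The two commands below should print the same value, because the container has no kernel of its own.

    # Kernel version on the host
    uname -r

    # Kernel version inside a throwaway Alpine container: it is identical,
    # because every container shares the host's Linux kernel
    docker run --rm alpine uname -r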

In addition, a virtual machine bundles the application, a full operating system, and the related libraries and binaries, which together can grow to many gigabytes in size. With Docker, containers share the host's Linux kernel, and only what each application actually needs is packed inside its container, making containers far smaller, more efficient, and more portable. They start up almost instantly and use RAM more effectively, and disk I/O and image management are also more efficient. All of these factors help explain why Docker is yet another example of open source software gaining ground over proprietary software.
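
These efficiency claims are easy to spot-check on any machine running Docker. The commands below are only a sketch, and exact sizes and timings will vary, but minimal images are typically measured in megabytes and containers start in well under a second.

    # List local images and their on-disk sizes
    docker images

    # Time a full container start and stop; no guest operating system has to boot
    time docker run --rm alpine true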

We at Logz.io run our entire system in Docker containers, which lets every developer spin up the whole stack on a laptop and test it as if it were running in production. Using Docker in production also helps us with continuous deployment, because we can deploy separate components of the system independently of one another.
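
As a hedged illustration of that workflow (the image and container names below are hypothetical, not our actual stack), each component runs as its own container, so a developer can start the full system locally and a new version of any single component can be rolled out without touching the others.

    # Start the system's components locally (image names are hypothetical)
    docker run -d --name queue redis
    docker run -d --name api -p 8000:8000 --link queue:queue myorg/api:1.0

    # Later, deploy a new version of one component independently of the rest
    docker pull myorg/api:1.1
    docker stop api && docker rm api
    docker run -d --name api -p 8000:8000 --link queue:queue myorg/api:1.1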

Containers vs. Virtual Machines

Docker is so popular that some industry analysts are talking about doing away with VMs altogether in favor of container technologies such as Docker and LXC. However, whether to use Docker depends on the specific needs of the particular project. Docker is not always the answer.

Containers work best when you need a specific application to handle an individual task. VMs take a more macro view and can host most modern operating systems. Containers are the answer in some situations, VMs in others, and in some cases both should be deployed at the same time. Cloud architects need to understand when and where each type of deployment is best for their organizations.

Containers are efficient because they share the same operating system. Some organizations, however, prefer hypervisors to containers for precisely that reason: a hypervisor can host many different operating systems side by side. Other complaints about Docker include:

  • Builds and deployments are slow and unpredictable
  • Some Docker supporters advocate putting data inside containers, a practice that makes the data difficult to back up or clone (a common alternative using volumes is sketched after this list)
  • Docker is not developer-friendly
  • There is a steep learning curve for Linux novices
  • It has a touchy and unpredictable command-line interface
  • Security isolation is weaker than with VMs
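
On the data point in particular, a common way to sidestep the problem is to keep state outside the container in a named volume (or a host directory) that can be backed up on its own. The sketch below uses hypothetical names.

    # Create a named volume to hold the application's data
    docker volume create app-data

    # Mount the volume into the container; the container stays disposable,
    # while the data lives in the volume and can be backed up separately
    docker run -d -v app-data:/var/lib/app myorg/myapp:1.0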

When a technology like Docker skyrockets in popularity, there is a tendency to want to use it in every potential use case. In reality, Docker should be used only when it is the best solution to the specific problem at hand.

The Rise of Microservices

Docker is part of a broad move in computing towards microservices: small, fast software applications that run in self-contained units. The clear service boundaries between components make it easy to build modular systems. Microservices are also built around products rather than projects; instead of viewing software as a project to be completed and handed off, the product approach increases collaboration and ownership among developers and stakeholders. In addition, microservices are designed for failure, meaning they are built with the expectation that individual components will fail, so that the system as a whole keeps running when they do.

Built to Last

While Docker the platform is open source, Docker the company generates revenue by offering support and services. It quickly raised $15 million from Greylock Partners in a Series B funding round in January 2014 and has to date attracted $180 million across five funding rounds from well-known firms including Goldman Sachs and Insight Venture Partners. Silicon Valley is firmly behind the Docker phenomenon.

Docker is a computing evolution that will not lose steam anytime soon. The system of software containers comes at a unique time when virtualization, cloud computing, more efficient data processing, and faster application development and deployment are all increasingly needed. Docker began life as a bright shooting star but has evolved into a solid, sensible solution that is built to last for years to come.
