Server computing is evolving. Catalyzed by the cloud, every aspect of software, from development to deployment, has undergone massive change.
Let’s talk about deployment, which happens to be a difficult and time-consuming process. We are all familiar with the friction between developers, testers, and sysadmins when code misbehaves across their respective environments: code that runs fine on the developer’s machine malfunctions in the tester’s or sysadmin’s environment. This makes shipping a tedious task that consumes workforce, time, and therefore money.
Even today, if you are shipping code for a project with a lot of web apps, you may be doing it in a conventional manner. However, this is changing, and container technologies such as Docker are playing a pivotal role in that change.
Docker can be defined as an open source project that provides containers as a service by virtualizing the operating system on top of the Linux kernel.
Docker has become extremely popular in recent years. While container technology has existed since the mid-2000s, its true potential has come to light only recently. Fast-growing tech companies such as Uber, which run a myriad of microservices across a plethora of hardware infrastructure, can no longer afford the conventional method of code deployment.
For a service like Uber, there are many working sections: the web front end, the static site, the user database, the analytics database, and so on, each running its own OS and occupying dedicated hardware.
In addition, the mix includes load balancers (LBs), caching servers, software libraries, and their respective dependencies. All of this must behave identically across environments, but often it does not, and deploying an entire codebase can take a month or so. This is where Container-as-a-Service (CaaS) technologies such as Docker are proving valuable by cutting that time down to a few hours.
What a CaaS such as Docker does is provide developers with containers: isolated user-space instances running in parallel on the same machine. The Linux kernel can run many containers at the same time, and to the app inside it, each Docker container looks like an entire system.
Each Docker container may hold an individual app along with its libraries and binaries. Binaries and libraries can also be shared between applications running in separate containers. All of these containers run on the Docker Engine, which in turn runs on a host operating system.
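To make the idea of "an app plus its libraries and binaries" concrete, here is a minimal, hypothetical Dockerfile for a small Node.js web app (the file names and base image are illustrative assumptions, not from any particular company mentioned here):

```dockerfile
# Base image supplies the shared binaries and libraries (Node.js runtime)
FROM node:18-alpine

# Work inside the container's own filesystem
WORKDIR /app

# Install only this app's declared dependencies
COPY package.json ./
RUN npm install

# Add the app's own code
COPY . .

# The container runs this single app as its entire user space
EXPOSE 3000
CMD ["node", "server.js"]
```

Building this file produces an image that bundles the app with everything it needs, so the same image behaves identically on a laptop, a test server, or production hardware.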
In other words, Docker provides OS-level virtualization. Unlike virtual machines (VMs), where each app runs on its own guest OS, Docker containers share the host kernel and carry no OS of their own. This makes them extremely efficient: they save a great deal of disk space, and they create a uniform environment across DevOps, eliminating discrepancies and errors while cutting shipping time from days or weeks to a few hours. This is why many companies now use Docker within their continuous integration (CI) infrastructure.
It is obvious that faster approval of code and a faster integration pipeline will clamp down on costs, a crucial necessity for fast-growing startups.
Does it now make sense why companies such as Uber, and various others who ship frequently, are using Docker containers? As more startups build their infrastructure along similar lines, the demand for container technology and CaaS will only increase.
PayPal engineers Ashish Hunnargukar and Mohit Soni say, “the developers do not really need a staging server provided you can run the same workload on your laptop quickly using a docker container. After that, they can take the same docker container and quickly integrate it into the product development life cycle. So, the developer can take to the production level pretty quickly.”
This means there is no need to ship code to a staging server. Developers can see their work running on their own laptops, eliminating the long test cycle.
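As a rough sketch of the build-test-ship loop described above, the workflow might look like this (the image name and registry are hypothetical, and the commands require a local Docker installation):

```shell
# Build the image once from the app's Dockerfile (image name is illustrative)
docker build -t myapp:dev .

# Run the very same container on the developer's laptop for testing
docker run -p 8080:3000 myapp:dev

# Ship the identical image onward -- no separate staging build needed
docker tag myapp:dev registry.example.com/myapp:dev
docker push registry.example.com/myapp:dev
```

Because the image pushed at the end is byte-for-byte the one tested locally, the "works on my machine" class of problems largely disappears.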
In addition, the containers (each being some form of microservice) can be developed and deployed individually. We already know that containers save a lot of space because they do not need separate operating systems. But the best thing about containers is their ability to interact with one another within a container cluster, a stable way to scale your infrastructure when you are growing at a very high rate.
The general system architecture is to group containers into a cluster using a cluster manager, which orchestrates the individual containers and aligns their activity the way you desire. Examples include Apache Mesos, Fleet, and Warden.
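On a single host, Docker Compose offers a simple taste of this orchestration idea; cluster managers like Mesos extend the same concept across many machines. Here is a minimal, hypothetical `docker-compose.yml` wiring together the kind of services mentioned earlier (all service names and images are illustrative):

```yaml
# docker-compose.yml -- a sketch, not a production configuration
version: "3"
services:
  web:
    image: example/web-frontend:latest   # hypothetical app image
    ports:
      - "80:3000"
    depends_on:
      - db
      - cache
  db:
    image: postgres:15                   # user database
    environment:
      POSTGRES_PASSWORD: example
  cache:
    image: redis:7                       # caching server
```

A single `docker compose up` then starts the front end, database, and cache as separate containers that can reach each other by service name.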
Let’s look around and go through some use cases from companies using Docker.
Google launches over 2 billion containers every week on its servers, and uses Docker in Google App Engine to let developers build custom runtimes using Node.js, Ruby, Go, Java, and PHP.
PayPal has shifted from VM-based deployments to Docker containers, and has scaled its continuous integration infrastructure using Docker and other container-management tools.
Spotify has 60+ backend services and an ever-growing user base, making shipping a mammoth task. To speed up a build pipeline spanning 5,000+ servers, they implemented Docker.
Like PayPal, eBay also employs Docker within its continuous integration process. All containerized applications go directly into CI, while database-driven tests run in parallel.
Shopify wanted developers to take ownership of their own code, which was difficult because the entire Shopify app is built in Ruby and runs as a giant monolith. To solve this, they employed Docker containers. Now developers own their respective containers, and the resulting performance too.
BBC News had 10 different continuous integration environments, locked down by nature. With 26,000 jobs running across these CI environments, the BBC had to make it easier for developers to ship and deploy code as quickly as possible.
Myriad other companies are using Docker at this moment. Most of these tech startups comprise multiple web apps that communicate with each other frequently, and in such an environment, market forces and competition will always push hundreds of developers to ship often and ship quickly. It is already happening, and it will likely drive growing adoption of containers.
Of the implementations we have done so far, our engineers say it is really cool and fast: CI now works faster, and they are shipping more frequently than ever. Also, they now have fewer sysadmins as enemies.
If your startup also involves co-dependent microservices or frequent data access between services, it will eventually challenge your pace of shipping and incur higher costs. If you do not want that to happen, maybe it is time to look toward Docker.