Containerizing applications instead of hosting them on virtual machines has been a growing trend in recent years, making container management a popular discipline. Docker sits at the heart of this transition, helping organizations seamlessly adopt containerization technology. Today, Docker use cases can be found across every industry, in organizations of all sizes and natures.
Table of contents
- What is Docker?
- What are Microservices?
- What are Containers?
- What are the benefits of using Docker?
- Companies Powered by Docker
- What’s Next After Docker?
- Container Management Systems
- Conclusion
- FAQs
What is Docker?
Docker is a containerization technology that enables developers to package a service into a container along with its dependencies, libraries and system tools. By separating apps from the infrastructure, Docker allows you to seamlessly deploy and move apps across a variety of environments.
Docker makes it very simple to create and manage containers using the following steps (a minimal example follows the list):
- Create a Dockerfile and add the code
- Build a Docker image based on the Dockerfile
- Create a running instance from the Docker image
- Scale containers on-demand
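To make these steps concrete, here is a minimal sketch for a hypothetical Node.js service; the file names, port and image tag are illustrative assumptions, not a prescribed setup:

```dockerfile
# Dockerfile: package the app together with its runtime and dependencies
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

```bash
docker build -t myapp:1.0 .           # build an image from the Dockerfile
docker run -d -p 3000:3000 myapp:1.0  # create a running instance of the image
```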
What are Microservices?
Traditionally, software was developed using a monolithic architecture, wherein the entire application is built as a single entity, often following a waterfall development method. Monolithic architectures come with size, complexity and scalability challenges. Microservices architecture addresses these challenges by breaking an application down into smaller independent units that communicate with each other using REST APIs. Each function is developed as an independent service that can run without affecting any of the others. Organizations can therefore accelerate release cycles, scale operations on-demand and make code changes without application downtime. Migrating from a monolithic architecture to microservices is a popular Docker use case.
What are Containers?
Containers are the natural packaging unit for a microservices architecture. A container is a standard unit of software that isolates an application from its underlying infrastructure by packaging it with all of its dependencies and required resources. Unlike virtual machines, which virtualize the hardware layers, containers only virtualize the software layers above the OS level. We discuss container management and container management tools below.
What are the benefits of using Docker?
Docker has become synonymous with containerization because of its portability and ecosystem. All major cloud providers, including AWS, GCP and Azure, have incorporated Docker into their platforms and offer first-class support. You can therefore run Docker containers in virtually any environment, including VirtualBox, Rackspace and OpenStack. Scalability is one of Docker's biggest benefits: by deploying multiple containers on a single host, organizations can significantly reduce operational costs. Docker also allows you to deploy services on commodity hardware, eliminating the cost of purchasing expensive servers.
Docker's underlying philosophy is to do more with fewer resources and smaller engineering teams: organizations can run operations with less infrastructure and less staff to monitor and manage it, which translates into cost savings and higher ROI. Docker lets you create and manage containers instantly, which facilitates faster deployments, and the ability to deploy and scale infrastructure from a simple YAML config file offers a faster time to market. Each isolated container also adds a layer of security.
You will find the most common Docker use cases below.
Docker Use Cases 1: From Monolith to Microservices Architecture
Gone are the days when software was developed using only a monolithic approach (typically with a waterfall model), wherein the entire application was built as a single entity. Although a monolith is initially easy to build, test, deploy and scale horizontally, management becomes a challenge as the application grows. A bug in any function can affect the entire app, and even a simple change requires rebuilding, retesting and redeploying the whole application. Adopting new technologies is equally inflexible.
Microservices, on the other hand, break the app down into multiple independent, modular services, each with its own database schema, that communicate with each other via APIs. The microservices architecture suits DevOps-enabled infrastructures, as it facilitates continuous delivery. By leveraging Docker, organizations can easily incorporate DevOps best practices into their infrastructure and stay ahead of the competition. Docker also lets developers share software along with its dependencies with operations teams and ensures that it runs the same way on both ends: administrators can use the Docker images that developers build from Dockerfiles to stage and update production environments. This reduces the complexity of building and configuring CI/CD pipelines while allowing a higher level of control over all changes made to the infrastructure. Load balancing configuration becomes easier too.
Docker Use Cases 2: Increased Productivity
In a traditional development environment, much of the complexity lies in manually defining, building and configuring development environments without delaying release cycles, and the lack of portability causes inconsistent behavior across apps. Docker lets you build containerized development environments from Docker images that are easy to set up and deliver consistent behavior throughout their lifecycle, with seamless support for the tools, frameworks and technologies your team uses.
Docker environments also facilitate automated builds, automated tests and webhooks. You can integrate Bitbucket or GitHub repositories with the development environment, create automatic builds from the source code and push them to a Docker repository. A connected workflow between developers and CI/CD tools also means faster releases.
Docker also provides a cloud-managed container registry (Docker Hub), eliminating the need to operate your own registry, which can get expensive as the underlying infrastructure scales, and removing much of the configuration complexity. Role-based access lets people across teams access Docker images securely, and Slack integration helps teams collaborate and coordinate throughout the product life cycle.
Offering accelerated development, automated workflows and seamless collaboration, there’s no doubt that Docker increases productivity.
Docker Use Cases 3: Infrastructure as Code
The microservice architecture enables you to break down software into multiple service modules allowing you to work individually with each function. While this brings scalability and automation, there’s a catch: it leaves you with hundreds of services to monitor and manage. This is where Infrastructure as Code (IaC) comes to your rescue, enabling you to manage the infrastructure using code. Basically, it allows you to define the provisioning of resources for the infrastructure using config files and convert the infrastructure into software, thereby taking advantage of software best practices such as CI/CD processes, automation, reusability and versioning.
Docker brings IaC into the development phase of the CI/CD pipeline: developers can use Docker Compose to define composite apps built from multiple services and ensure that they work consistently across the pipeline. IaC is a typical example of a Docker use case, as the sketch below shows.
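As an illustration, here is a minimal, hypothetical docker-compose.yml describing a two-service app as code; the service names, ports and images are assumptions for the sketch:

```yaml
# docker-compose.yml: the whole environment is declared as versionable code
version: "3.8"
services:
  web:
    build: .            # built from the Dockerfile in this repo
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: example
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
```

Running `docker compose up` then recreates an identical stack on any machine in the pipeline.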
Docker Use Cases 4: Multi-Environment Standardization
Docker provides production parity for every member across the pipeline. Consider a growing software development team: each new member has to install or update the operating system, database, Node.js, Yarn and so on, which can take one or two days just to get a machine ready. It's also a challenge to ensure that everyone ends up with the same OS, program versions, database versions, Node versions, code editor extensions and configurations.
For instance, if two programs use two different versions of the same library, you need to install both versions, and custom environment variables must be set before executing each program. And what if you make last-minute changes to dependencies in development and forget to make the same changes in production?
Docker packages all of the required resources into a container and ensures that there are no conflicts between dependencies. Because everything the app needs is declared in its image, untracked elements that would otherwise break your environment are easy to spot. Docker standardizes the environment so that containers behave the same throughout the CI/CD pipeline; a small sketch of such a pinned environment follows.
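As a minimal sketch, a team could pin every tool version in a shared development image; the versions and file names here are illustrative assumptions:

```dockerfile
# Dockerfile.dev: one pinned toolchain for the whole team
FROM node:18.19-bullseye             # everyone gets the same OS and Node version
WORKDIR /workspace
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile   # identical dependency tree on every machine
COPY . .
CMD ["yarn", "start"]
```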
Docker Use Cases 5: Loosely Coupled Architecture
Gone are the days of the traditional waterfall software development model. Today, developers, enabled by the cloud and microservices architecture, are breaking applications into smaller units and easily building them as loosely coupled services that communicate with each other via REST APIs. Docker helps developers package each service into a container along with the required resources making it easy to deploy, move and update them.
The telecom industry is leveraging 5G and Docker's support for software-defined networking to build loosely coupled architectures. 5G networks rely on network function virtualization, which allows telecoms to virtualize network appliance hardware: each network function can be developed as a service and packaged into a container. These containers can be installed on commodity hardware, eliminating the need for expensive hardware infrastructure and significantly reducing costs. The fairly recent entrance of public cloud providers into the telecom market has shrunk the profits of telecom operators and ISVs; they can now use Docker to build cost-effective public clouds on existing infrastructure, turning Docker use cases into new revenue streams.
Docker Use Cases 6: For Multi-tenancy
Multi-tenancy is a cloud deployment model wherein a single installed application serves multiple customers with the data of each customer being completely isolated. Software-as-a-Service (SaaS) apps mostly use the multi-tenancy approach.
There are four common approaches to a multi-tenancy model:
- Shared database, isolated schema: all tenants' data is stored in a single database, with a separate schema for each tenant. Isolation is medium.
- Shared database, shared schema: all tenants' data is stored in a single database, with each tenant's rows identified by a foreign key. Isolation is low.
- Isolated database, shared app server: each tenant's data is stored in a separate database. Isolation is high.
- Docker-based isolated tenants: each tenant's data is stored in a separate database, and each tenant is served by its own set of containers.
While tenant data is separated, the first three approaches still use the same application server for all tenants. Docker, by contrast, allows for complete isolation wherein each tenant's app code runs inside its own set of containers.
To do this, organizations convert the app code into a Docker image and use a docker-compose.yaml file to define the multi-container, multi-tenant configuration, spinning up a container set per tenant. Each tenant gets its own Postgres database container and its own app server container, so two tenants require two database containers and two app containers. An NGINX server container in front routes each request to the right tenant's containers, as sketched below.
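A hypothetical docker-compose.yaml for two tenants might look like the following; every name, image and credential here is a placeholder:

```yaml
# docker-compose.yaml: one app + database pair per tenant, fronted by NGINX
version: "3.8"
services:
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro   # routes requests per tenant
  app_tenant_a:
    image: myapp:latest
    environment:
      DATABASE_URL: postgres://app:secret@db_tenant_a:5432/app
  db_tenant_a:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: secret
  app_tenant_b:
    image: myapp:latest
    environment:
      DATABASE_URL: postgres://app:secret@db_tenant_b:5432/app
  db_tenant_b:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: secret
```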
Docker Use Cases 7: Speed Up Your CI/CD Pipeline Deployments
Unlike monolithic applications, which can take minutes to start, containers launch within seconds because they are lightweight. You can therefore deploy code at lightning speed and rapidly roll out changes to codebases and libraries using containers in CI/CD pipelines. Note, however, that long build times can slow down CI/CD deployments: the pipeline starts from scratch every time, so dependencies must be pulled on each run. Docker's layer cache overcomes this build issue, but it only exists on the machine that ran the build and is therefore not available to remote runner machines.
To solve this, pass the `--cache-from` flag to `docker build` to seed the cache from a previously built image. If that image doesn't exist locally, pull it just before running `docker build`. Note that this method only reuses layers from the image you reference; to cache the earlier stages of a multi-stage build, you should push and pull an image for each stage.
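A sketch of the pattern on a remote runner, assuming a hypothetical registry and image name:

```bash
# pull the previous build to seed the layer cache (ignore failure on the first run)
docker pull registry.example.com/myapp:latest || true
# reuse its layers, then publish the fresh image for the next run
docker build --cache-from registry.example.com/myapp:latest -t registry.example.com/myapp:latest .
docker push registry.example.com/myapp:latest
```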
Docker Use Cases 8: Isolated App Infrastructure
One of Docker's key advantages is its isolated application infrastructure. Each container is packaged with all of its dependencies, so you don't need to worry about dependency conflicts. You can deploy and run multiple applications on one or many machines with ease, regardless of OS, platform or app version. Consider two servers running different versions of the same application: running them in independent containers eliminates the dependency issues, as the sketch below shows.
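For instance (image names and ports are hypothetical), two versions of the same app can run side by side on one host:

```bash
# each version is fully isolated, with its own dependencies baked into the image
docker run -d --name myapp-v1 -p 8081:8080 myapp:1.0
docker run -d --name myapp-v2 -p 8082:8080 myapp:2.0
```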
For automation and debugging, each isolated container can run its own SSH server, and `docker exec` provides shell access without one. Because each service or daemon is isolated, it's easy to monitor the applications and resources running inside a container and quickly identify errors. This lets you run an immutable infrastructure, minimizing downtime caused by infrastructure changes.
Docker Use Cases 9: Portability – Ship any Application Anywhere
Portability is one of Docker's headline benefits. Portability is the ability of a software application to run in any environment, regardless of the host OS, plugins or platform. Containers are portable because they ship with everything the application needs to run: the code, runtime, system libraries and configuration settings. Portability is also measured by how much tweaking an application needs to move to another host environment; for example, Linux containers run on all Linux distributions but sometimes fail to work in Windows environments. Docker offers strong portability, letting you move an app between environments without significant changes to its configuration. Docker set the standard for containerization, so it's no surprise that its containers are highly portable. Moreover, Docker containers use the host machine's OS kernel rather than bundling a guest OS, which keeps them lightweight and easy to move between environments.
The foregoing is especially useful when developers want to test an application in various operating systems and analyze the results. Any discrepancies in code will only affect a single container and therefore won’t crash the entire operating system.
Docker Use Cases 10: Hybrid and Multi-cloud Enablement
According to Channel Insider, the top three drivers of Docker adoption in organizations are hybrid clouds, VMware costs and pressure from testing teams. Although hybrid clouds are flexible and allow you to run customized solutions, distributing the load across multiple environments can be a challenge. In order to facilitate seamless movement between clouds, cloud providers usually need to compromise on costs or feature sets. Docker eliminates these interoperability issues seeing as its containers run in the same way in both on-premise and cloud deployments. You can seamlessly move them between testing and production environments or internal clouds built using multiple cloud vendor offerings. Also, the complexity of deployment processes is reduced.
Thanks to Docker, organizations can build hybrid and multi-cloud environments comprising two or more public/private clouds from different vendors. Migrating from AWS to the Azure cloud is easy. Plus, you can select services and distribute them across different clouds based on security protocols and service-level agreements.
Docker Use Cases 11: Reduce IT/Infrastructure Costs
With virtual machines you need to run an entire guest operating system per workload; not so with Docker. Docker lets you provision fewer resources, run more apps per machine and optimize resource usage; developer teams can consolidate resources onto a single server, reducing storage costs. Docker also scales well: you can provision exactly the resources a given moment requires and scale the infrastructure automatically on-demand, paying only for the resources you actually use. Apps running inside Docker behave the same across the CI/CD pipeline, from development to testing, staging and production, which minimizes bugs and errors. This environment parity lets organizations manage the infrastructure with minimal staff and technical resources, saving considerably on maintenance costs. By boosting productivity, Docker also reduces the headcount you would need in a traditional software development environment. Finally, Docker ships with strong security defaults, and the core engine is open-source and free.
Docker Use Cases 12: Security Practices
Docker containers are quite secure by default. When you create a container, Docker automatically creates a set of namespaces that isolate it, so a container cannot access or affect processes running inside another container. Similarly, each container gets its own network stack, which means it cannot gain privileged access to the network ports, sockets and interfaces of other containers unless specific permissions are granted. In addition to resource accounting and limiting, control groups (cgroups) handle the provisioning of memory, CPU and disk I/O resources, which helps mitigate denial-of-service scenarios: a resource-exhausted container cannot take down the whole system.
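For example, cgroup resource limits can be applied directly on `docker run` (the image and the limit values are illustrative):

```bash
# cap memory, CPU share and relative block-IO weight for a single container
docker run -d --name web --memory=512m --cpus=1.5 --blkio-weight=500 nginx:alpine
```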
When a container launches, the Docker daemon starts it with a restricted set of capabilities, replacing the binary root/non-root dichotomy with fine-grained access controls. This improves security because many processes that traditionally run as root don't need real root privileges and can operate with fewer of them. Another important feature is running only signed images, using the Docker Content Trust signature verification feature defined in the dockerd configuration file. For an extra layer of hardening, tools such as SELinux, AppArmor and grsecurity can help.
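Content trust can also be switched on per shell with an environment variable; a small sketch:

```bash
# with content trust on, docker refuses to pull or run unsigned image tags
export DOCKER_CONTENT_TRUST=1
docker pull nginx:latest   # fails unless the tag carries a valid signature
```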
Docker Use Cases 13: Disaster Recovery
While hybrid and multi-cloud environments offer amazing benefits to organizations, they also pose certain challenges. Maintaining resilience is a notable one. In order to ensure business continuity, your applications must withstand errors and failures without data losses. You can’t afford downtimes when a component fails, especially with critical applications. As such, we recommend that you remove single points of failure using redundant component resiliency and access paths for high availability. Applications should also possess self-healing abilities. Containers can help you in this regard. Nevertheless, for cases where unforeseen failures arise, you need a disaster recovery plan that reduces business impact during human-created or natural failures.
Docker containers can be created or destroyed easily and almost instantly. When a container fails, it can be replaced immediately, since containers are built from Docker images defined by Dockerfile configurations. Before moving to another environment, you can commit a container's state to a new image and archive it, then restore from that image in case of a disaster, as sketched below.
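A minimal sketch of that commit, archive and restore cycle, with hypothetical container and image names:

```bash
# snapshot a running container into an image and archive it off-host
docker commit mydb mydb-backup:2024-01-01
docker save -o mydb-backup.tar mydb-backup:2024-01-01
# on the replacement host, load the archive and start a new container
docker load -i mydb-backup.tar
docker run -d --name mydb mydb-backup:2024-01-01
```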
All of this being said, it’s important to understand that the underlying hosts may be connected to other components. Therefore, your disaster recovery plan should involve spinning up a replacement host as well. In addition, you should consider things like stateful servers, network and VPN configurations, etc.
Docker Use Cases 14: Easy Infrastructure Scaling
Docker augments the microservices architecture, wherein applications are broken down into independent services and packaged into containers, and organizations are combining microservices and cloud architectures to build distributed applications. Docker lets you instantly spin up identical containers for an application and scale the infrastructure horizontally. As the number of containers grows, you'll need a container orchestration tool such as Kubernetes or Docker Swarm. These tools offer smart scaling abilities, automatically scaling the infrastructure up on-demand, and they help optimize costs by removing unnecessary containers. Keeping components fine-grained makes orchestration easier, and stateless, disposable components make container lifecycles easy to monitor and manage. A small Swarm sketch follows.
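As a sketch with Docker's built-in orchestrator (this assumes a swarm has been set up with `docker swarm init`; the service name and image are illustrative):

```bash
# run three identical replicas behind Swarm's built-in load balancing
docker service create --name web --replicas 3 -p 80:80 nginx:alpine
# scale horizontally on demand
docker service scale web=10
```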
Docker Use Cases 15: Dependency Management
Isolation of dependencies is the strongest feature of containers. Consider two applications that use different third-party libraries: if they depend on different versions of the same library, keeping tabs on the version difference throughout the product life cycle is a challenge. You may also need containers to talk to each other, for instance when an app needs the database associated with another app. And when you move an application to a new machine, you have to remember all of its dependencies; version and package conflicts can be painful.
When reproducing an environment, there are OS, language and package dependencies to take care of. If you work with Python, you'll need dependency management tools such as virtualenv, venv or pyenv. If the new environment lacks a tool like git, you'll need a script to install the git CLI, and that script keeps changing across operating systems and OS versions, so every team member has to understand these tools, which isn't always easy.
Be it OS, language or CLI tool dependencies, Docker is an excellent tool for dependency management. By simply defining the configuration and its dependencies in the Dockerfile, you can move an app to another machine or environment without having to remember the dependencies, worry about package conflicts or keep track of user preferences and local machine configurations. A hypothetical example follows.
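For a Python service, for instance, the OS, interpreter and packages can all be pinned in one place; the versions and file names here are illustrative assumptions:

```dockerfile
# Dockerfile: the image itself becomes the dependency manifest
FROM python:3.11-slim                                # pins OS userland and Python version
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt   # pinned package versions
COPY . .
CMD ["python", "main.py"]
```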
Companies Powered by Docker
Docker use cases are not limited by region or industry.
PayPal is a leading US-based financial technology company offering online payment services across the globe. The company processes around 200 payments per second across three different systems: PayPal, Venmo and Braintree. Moving services between different clouds and architectures used to delay deployment and maintenance tasks, so PayPal implemented Docker and standardized its apps and operations across the infrastructure. To date, the company has migrated 700 apps to Docker, with 4,000 software employees managing 200,000 containers and 8+ billion transactions per year, achieving a 50% increase in productivity.
Adobe also uses Docker for containerization tasks. For instance, ColdFusion is an Adobe web programming language and application server that facilitates communication between web apps and backend systems. Adobe uses Docker to containerize and deploy ColdFusion services. It uses Docker Hub and Amazon Elastic Container Registry to host Docker images. Users can therefore pull these images to the local machine and run Docker commands.
GE is one of the few companies bold enough to embrace new technology at its embryonic stage, and it has become a leader over the years. As a long-time adopter, the company operates many legacy apps that used to delay its deployment cycle. GE turned to Docker and has since managed to considerably reduce development-to-deployment time, and it now achieves higher application density than it could with VMs, reducing operational costs.
What’s Next After Docker?
Once you understand how Docker impacts different areas of the business, the next step is learning to fully leverage the technology. As operations evolve, the need for thousands of containers arises. Thankfully, Docker is highly scalable: you can easily scale services up and down, defining the number of replicas you need with the scale command:
```bash
docker service scale frontend=50
```
You can also scale multiple services at once with a single docker service scale command.
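For instance (service names are illustrative):

```bash
# scale several services in one command
docker service scale frontend=50 backend=30 worker=10
```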
Container Management Systems
As business evolves, organizations need to scale operations on-demand, and as container clusters grow, orchestrating them becomes challenging. Container management systems help you manage container tasks from creation and deployment all the way to scaling and destruction, applying automation wherever applicable. Basically, they simplify container management. In addition to creating and removing containers, these systems handle other container-related tasks such as orchestration, security, scheduling, monitoring, storage, log management, load balancing and network management. According to Datadog, organizations that use container management systems host 11.5 containers per host on average, compared to 6.5 containers per host in non-orchestrated environments.
Popular Container Management Tools
Here are some of the most popular container managers for your business.
- Kubernetes: Kubernetes is the most popular container management tool. Originally developed by Google, it quickly became the de facto standard for container management and orchestration. Google donated the tool to the Cloud Native Computing Foundation (CNCF), and it is now backed by industry giants such as IBM, Microsoft, Google and Red Hat. It enables you to package, test, deploy and manage large clusters of containers with ease, and it's open-source, cost-effective and cloud-agnostic.
- Amazon EKS: As Kubernetes became the standard for container management, cloud providers started incorporating it into their platform offerings. Amazon Elastic Kubernetes Service (EKS) is a managed service for running Kubernetes on AWS. With EKS, organizations don't need to install and configure Kubernetes worker nodes or control planes, seeing as EKS handles that for you. In a nutshell, EKS acts as a container service and manages container orchestration for you; however, it only works with the AWS cloud.
- Amazon ECS: Amazon Elastic Container Service (ECS) is a fully managed container management tool for AWS environments that helps organizations run microservices and batch jobs with ease. ECS looks similar to EKS but differs in that it manages container clusters with its own scheduler, whereas EKS runs Kubernetes. ECS itself is free, while EKS charges $0.10 per hour per cluster; on the other hand, because EKS runs open-source Kubernetes, it enjoys broader community support, whereas ECS is more of a proprietary tool. ECS is most useful for teams without extensive DevOps resources or teams that find Kubernetes too complex.
Also read: Amazon ECS vs EKS
- Amazon Fargate: Amazon Fargate is a serverless container service that enables organizations to run containers without having to manage servers or container clusters. It's actually a part of ECS, but it also works with EKS. While ECS offers better control over infrastructure, that control brings management complexity; if you want to run specific tasks without worrying about infrastructure management, we recommend Fargate.
- Azure Kubernetes Service: Azure Kubernetes Service (AKS) is a fully managed Kubernetes service offered by Microsoft for Azure environments. It's built on open-source Kubernetes and mostly free, seeing as you only pay for the associated resources. AKS integrates with Azure Active Directory (AD) and offers a higher security level through role-based access controls. It integrates seamlessly with Microsoft solutions and is easy to manage using the Azure CLI or the Azure portal.
- Google Kubernetes Engine: Google Kubernetes Engine (GKE) is a managed Kubernetes service launched by Google in 2015 to manage Compute Engine instances running Kubernetes. GKE was the first ever managed Kubernetes service, followed by AKS and EKS, and it offers more features and automation than its competitors. Google charges $0.15 per hour per cluster.
Conclusion
In today's complex software development environments, comprising multiple operating systems, programming languages, plugins, frameworks, container management tools and architectures, Docker creates a standardized workflow for every member throughout the product life cycle. More importantly, Docker is open-source and supported by a strong, vibrant community that can help you with any issues. Failing to leverage these Docker use cases risks leaving you behind your competitors.
FAQs
Which technologies are most commonly run inside Docker containers?
Redis, NGINX and Postgres are the three most widely used technologies running inside Docker containers. Other common ones include MySQL, MongoDB, Elasticsearch, RabbitMQ and HAProxy.
How scalable is Docker?
Docker technology is highly scalable: you can grow an infrastructure to millions of containers. Companies such as Google and Twitter deploy containers at that scale.
Does Docker support platforms other than Linux?
Yes, Docker supports macOS and Windows in addition to Linux.
Why should developers use Docker?
Docker makes it easy for developers to quickly create, deploy and manage apps on any platform. It also optimizes resources at its core, allowing you to deploy more apps on the same hardware than comparable approaches.