What is containerisation and what’s so great about it anyway? You may work with container-based applications every day, but there comes a time when an intern, a fresh graduate, or perhaps a non-IT colleague on your team asks you these questions, and you find yourself stopped in your tracks, thinking: “where should I start?”.
This article presents the two key points about containerisation you need to know by heart: isolation of process and consistency.

Isolation of process

1. A quicker deployment activity
Traditional deployment requires heavy manual work: the application’s dependencies have to be installed on the host. Isolating all of those dependencies inside a container releases engineers from that work. For a container-based application, the dependencies are defined as code in YAML files and deployed together with the application, as sketched below.

Additionally, traditional deployment might require a full OS restart, even for a simple update. This is a downside, as it means stopping services on the host OS. With containerisation, creating a new application container does not require stopping any other services on the host.
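As a minimal sketch of what “dependencies defined as code” can look like, here is a Docker Compose style YAML file. The article does not name a specific tool, so the service names, images, port numbers and password below are illustrative assumptions:

```yaml
# docker-compose.yml -- illustrative sketch only
services:
  web:
    image: example-web-app:1.0.0    # hypothetical application image: the app and its runtime packaged together
    ports:
      - "8080:8080"                 # published port; no web server is installed on the host itself
    depends_on:
      - db
  db:
    image: postgres:16              # a dependency pulled as a ready-made container image
    environment:
      POSTGRES_PASSWORD: example    # illustrative value only
```

Assuming Docker Compose is the tool in use, running `docker compose up` pulls the images and starts both containers; nothing beyond the container runtime has to be installed on the host.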
2. A more predictable deployment activity (and more control of deployment)

A benefit of containerisation that is rarely talked about is that it makes the deployment activity more predictable. Because engineers are released from the cumbersome steps of a traditional deployment, which normally introduce unpredictability and eat up a lot of their time, they can focus on more important work and plan their deployment activity better.

Moreover, containerisation gives engineers more control over the deployment environment. Because it requires the desired state of the deployment environment to be defined as code (YAML files), engineers can debug that definition directly, and they can record and track changes to the resource files under version control. This, in turn, contributes to higher engineer productivity. A minimal example of such a desired-state definition is sketched below.
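As a sketch of “desired state as code”, assuming a Kubernetes-style deployment (again, the article does not name a specific orchestrator), the manifest below declares how the environment should look; the names, labels and replica count are illustrative:

```yaml
# deployment.yaml -- illustrative sketch of a desired-state definition
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-web-app            # hypothetical name
spec:
  replicas: 3                      # desired state: three identical copies of the container
  selector:
    matchLabels:
      app: example-web-app
  template:
    metadata:
      labels:
        app: example-web-app
    spec:
      containers:
        - name: web
          image: example-web-app:1.0.0   # upgrading means changing this line and applying the file
```

Because a file like this lives in version control, a deployment becomes a reviewable diff (for example, a one-line change of the image tag), which is what gives engineers the debugging and change-tracking ability described above.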
Consistency

1. Multiple environment deployments

Because the application is packaged together with all of its dependencies, deployment is consistent no matter where the environment is (and no matter what the host operating system is!). Engineers can be more productive, as they spend less time debugging and diagnosing differences between environments, and they can safely assume that what they observe in the Dev and Test environments is what they will get in Production. The sketch below shows one deployment definition being reused across environments.
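To illustrate, here is a sketch of one Compose-style file reused unchanged across Dev, Test and Production. It relies on Docker Compose’s environment-variable substitution, and the variable names are illustrative assumptions:

```yaml
# docker-compose.yml -- the same file is used in every environment
services:
  web:
    image: example-web-app:${APP_VERSION}   # the exact same image runs in Dev, Test and Production
    environment:
      APP_ENV: ${APP_ENV}                   # only configuration values differ per environment
```

For example, `APP_ENV=dev APP_VERSION=1.0.0 docker compose up` on a laptop and `APP_ENV=production APP_VERSION=1.0.0 docker compose up` on a production host start the very same image, so behaviour observed in Dev and Test carries over to Production.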
2. Reusability

Containerisation also makes it possible to create templates for your deployment and resource files. This is highly beneficial because engineers can reuse the templates across different deployments, which again creates consistency (the parameterised file sketched above is one simple form of such a template).

In conclusion, this article describes the two key points of containerisation you need to know by heart: isolation of process and consistency. They are worth remembering, especially when you need to explain containerisation to a layman, because the many benefits of containerisation can be derived from these two points. The article also highlights that these benefits are not only technical: they also make engineers’ work more efficient and productive.

Finally, I hope this article helps you the next time someone asks: “What is containerisation and what’s so great about it anyway?”.