
Boxed in

Linux containers provide more opportunities for IT growth.

By Muggie van Staden, CEO, Obsidian Systems.
Johannesburg, 03 Jun 2015

When it comes to 'as a service' solutions, technologists have become almost blasé about infrastructure, platforms and software. With businesses starting to readily adopt these solutions, now is the time to consider the impact a different way of thinking can have on the organisation: using containers in the enterprise.

In the true sense of the word, containers are multiple isolated Linux user-space environments running on a single host and sharing its kernel. These containers provide an isolated environment for applications. A big component of this is Docker, an open source project that automates the deployment of applications within these software containers. It provides an additional layer of abstraction and automation on top of operating system-level virtualisation on Linux.

Okay, so enough of the definitions. What does this actually mean?

In essence, Linux containers and Docker have the potential to radically change the way applications are developed, shipped and deployed in companies. These containers make it easy to package solutions along with their dependencies. This allows a containerised application to work across different environments and platforms, whether it is a physical or virtual server, a public cloud, or even a network device.
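To make the packaging idea concrete, the sketch below shows a minimal Dockerfile of the kind a team might write; the base image, file names and port are illustrative assumptions, not details from any specific project.

    # Illustrative Dockerfile: packages a small Python web app and its dependencies
    # (file names and base image are assumptions for the sake of example).

    # Base image providing the operating system user space and language runtime
    FROM python:2.7

    # Copy the dependency list and install it inside the image
    COPY requirements.txt /app/
    RUN pip install -r /app/requirements.txt

    # Copy the application code itself
    COPY app.py /app/

    # Document the port the application listens on
    EXPOSE 8080

    # Command executed when a container is started from this image
    CMD ["python", "/app/app.py"]

Building the image (docker build) and running the resulting container (docker run) then behaves the same whether the target is a developer's laptop, a virtual server in the data centre or a public cloud instance.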

All about the app

Application modernisation is changing the way IT is approached. Businesses are starting to understand the benefits of having their own enterprise applications to cater for the requirements of employees. Because employees are comfortable using applications in their personal lives, they expect their employers to follow suit and give them apps that meet their business needs.

While this application-centric approach has been steadily taking hold, the entire concept of development and operations (DevOps) is shifting to focus more on containers and what they enable the business to achieve. Containerisation is creating an entirely new way of thinking about applications. By encouraging a more collaborative approach between software developers and other IT professionals, containerisation through DevOps creates a path for more cross-departmental work and less of a silo approach. This integration means a greater focus on core business deliverables and less on the mechanics of how the work is done.

Local companies are starting to look seriously at what this will allow them to accomplish. For example, banks could build continuous delivery and integration pipelines, with the toolsets running in the background. This creates an environment that moves beyond the theoretical and into the practical.

A container-based architecture also means companies are able to move from monolithic stacks to applications built around microservices. But, while the concept might be a good one, it is unfortunately not as easy as simply migrating existing applications into this environment. Quite a bit of work is involved at the infrastructure level. With IT departments already taking strain implementing other new technology concepts in the local workplace (think cloud, data analysis and so on), there might be some initial resistance to looking at containers.

Fear not

However, if the underlying infrastructure is built correctly, developers and engineers can ensure the environment is able to support containers. Solutions are already available that provide all the components necessary to easily package and run applications as containers. Thanks to these solutions, going the container route does not have to be intimidating for companies.

Thanks to solutions such as Red Hat's Project Atomic, developers working in these DevOps environments can now deploy container-centric applications within minutes. This lets them move beyond the hype to something that is practically implementable.

Of course, with any change in development approach, concerns will inevitably turn to the impact on security. As with other parts of the IT infrastructure, containers need security updates, often on demand. These updates should not affect the reliability of the containers and should be easy to deploy into corporate systems.

Linux-based container infrastructure could very well herald the next big wave of virtualisation inside the organisation. By providing companies with a more lightweight option than traditional virtual machines, containers could give them the flexibility required to be more dynamic in a competitive market. Because containers offer operating system-level virtualisation as an alternative to hypervisor-based virtual machines, isolation between applications is easier to achieve than in the past.

Containers, therefore, create the potential to simplify IT operations. Using containers means there are fewer operating systems to manage and applications have greater mobility, which results in more efficient use of existing resources. Idle containers take up very little computing, memory or other resources. In terms of speed, containerised applications can boot and restart in seconds, compared with the minutes a virtual machine can take. In part, this is thanks to the smaller payloads of these applications, which also contribute to more efficient and cost-effective operations.
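As a rough illustration of that speed (the image name and port mapping below are assumptions for the sake of example), starting, stopping and restarting a containerised service from the Docker command line typically takes seconds:

    # Start a container from an existing image; this usually completes in seconds
    docker run -d --name web -p 8080:80 nginx

    # Stop it when it is not needed; a stopped container consumes no CPU or memory
    docker stop web

    # Bring it back on demand, again in seconds rather than minutes
    docker start web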

The business benefits of this container approach could prove too inviting for IT departments in SA to ignore.
