
Realise mega advantages with microservices

When putting together all of the architecture concepts of microservices, containers and orchestration, the benefits are tremendous.
By Pete Nel, Extreme Networks business unit manager, Duxbury Networking.
Johannesburg, 11 Dec 2020

In my last Industry Insight, I discussed microservices, containers and orchestration. In this article, I will discuss what happens when these concepts are combined.

When putting all of these architecture concepts together, you end up with an application composed of many loosely-coupled services that interact via well-defined APIs, operating as agile containers, orchestrated with an intelligent engine on top. The benefits are tremendous.

Conceptually, these technologies combine into a cloud architecture in which the orchestration layer manages the build-out of services via containers, while the application also incorporates multiple databases of different types as well as serverless compute functions.

Scalability

Decoupling services into containers brings a massive benefit in scalability: the application can support large-scale deployments while dynamically right-sizing its services based on unique customer deployments, network changes and data usage over time.

This management happens at the orchestration layer, but it is the composition into discrete services that makes it possible.

For example, if control plane functionality (terminating AP/switch connections, decrypting tunnels and packaging data into a queue/bus) is delivered as a dedicated service within the application, that functionality can be scaled up and down based on the number of unique nodes needing control services.


Meanwhile, if the API (or any other unrelated service) is very lightly loaded, it can run with fewer resources and scale up only when demand grows. All of this can be managed automatically, with monitoring built into the orchestration flow.
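As a rough illustration, the Python sketch below shows the kind of per-service right-sizing rule an orchestration layer might apply; the service names, thresholds and numbers are hypothetical, and this is a sketch of the idea rather than any particular product's scaling logic.

```python
# Minimal sketch, with hypothetical service names and thresholds: how an
# orchestration loop might right-size each service independently of the others.
SCALE_RULES = {
    # service: (target utilisation %, min replicas, max replicas)
    "control-plane": (70, 2, 50),   # scales with the number of connected APs/switches
    "api":           (70, 1, 10),   # lightly loaded today, but allowed to grow
}

def desired_replicas(service: str, current: int, utilisation: float) -> int:
    """Replica count the orchestrator should converge towards for one service."""
    target, lo, hi = SCALE_RULES[service]
    wanted = max(1, round(current * utilisation / target))  # proportional rule
    return min(max(wanted, lo), hi)                         # clamp to the allowed range

# The busy control plane (95% utilisation across 4 replicas) scales out,
# while the lightly loaded API (20% across 2 replicas) scales in.
print(desired_replicas("control-plane", current=4, utilisation=95))  # -> 5
print(desired_replicas("api", current=2, utilisation=20))            # -> 1
```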

Resilience

Microservices also provide an innate level of reliability because each service is maintained independently.

For example, suppose there is a problem with the node providing the events service and it needs to restart. The problematic events container/service can be restarted individually without affecting the other nodes or services (authentication, admin login, licensing, control plane, API, reports and so on).

Compare that to a heavily integrated monolithic hardware or VM appliance, which typically does not work that way: fixing one service would normally mean manually rebooting the entire platform.
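A minimal sketch of that difference, again with hypothetical service names and a placeholder health probe: the reconcile step restarts only the container that is failing and leaves everything else running.

```python
# Minimal sketch, hypothetical names: restart only the failing service,
# leaving the rest of the application untouched.
SERVICES = ["authentication", "admin-login", "licensing", "control-plane",
            "api", "reports", "events"]

def is_healthy(service: str) -> bool:
    """Placeholder probe; a real orchestrator would hit a liveness endpoint."""
    return service != "events"   # pretend only the events service is misbehaving

def restart(service: str) -> None:
    print(f"restarting the container for {service!r} only")

def reconcile() -> None:
    for service in SERVICES:
        if not is_healthy(service):
            restart(service)      # the other services keep running throughout

reconcile()   # an orchestrator would run this check on a schedule, not once
```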

Modularity

The microservices architecture often leverages containers, which makes the service software highly modular. Each service can use a software stack that is specific to, and optimised for, the requirements of that unique service.

It does not matter if that software stack differs from the rest of the services in the application. And if a stack needs to change, there is no need to overhaul the entire application or rewrite a raft of features just to upgrade or replace one software component.
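To make that concrete, a deployment description might look like the Python sketch below, where every service is built from its own image and stack, so changing one does not touch the others. The image names, stacks and versions are invented purely for illustration.

```python
# Hypothetical deployment description: each service carries its own image,
# stack and replica count, so any one of them can change in isolation.
SERVICES = {
    "events":    {"image": "registry.example/events:2.3",    "stack": "go",     "replicas": 3},
    "reports":   {"image": "registry.example/reports:1.9",   "stack": "python", "replicas": 2},
    "licensing": {"image": "registry.example/licensing:4.1", "stack": "java",   "replicas": 1},
}

# Upgrading the reports stack is a one-line change to one entry; nothing
# else in the application needs to be rebuilt or redeployed.
SERVICES["reports"]["image"] = "registry.example/reports:2.0"
```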

API-driven

APIs are often discussed from a management automation perspective: public APIs can be used to build automation into management workflows.

Here, however, the focus is on the internal workings of an application, with private APIs. Services rely on these APIs to integrate with one another.

Conveniently, microservices allow each service’s API to be well-defined and specific to its function. This creates integration within the application only where necessary, and avoids the spaghetti effect.
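As an illustration (the service and method names are hypothetical), a private API like the one sketched below gives other services a narrow, well-defined surface to depend on, rather than each other's internals.

```python
# Minimal sketch, hypothetical names: the events service exposes a small,
# well-defined private API, and other services integrate only through it.
from dataclasses import dataclass

@dataclass
class Event:
    node_id: str
    severity: str
    message: str

class EventsAPI:
    """The only surface other services (reports, alerting, etc.) depend on."""

    def publish(self, event: Event) -> None: ...

    def recent(self, node_id: str, limit: int = 50) -> list[Event]: ...

# The reports service calls EventsAPI.recent(); it never reaches into the
# events service's database or internals, which avoids the spaghetti effect.
```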

Generations of cloud

In my last article, I talked about the evolution from on-premises hardware to virtual machines and eventually into the cloud, with an ongoing progression toward newer generations of cloud architecture.

Many cloud systems in the networking industry have evolved (or are evolving) along this progression, with the more advanced systems leveraging the full benefits of cloud alongside newer technologies.

The cloud generations can be summarised as follows:

First generation: Virtual machines (VMs) are installed and operated from a public cloud using infrastructure-as-a-service. Virtual machines are managed in a more manual way by a DevOps team in a single-tenant environment with some amount of continuous integration.

Second generation: The services offered by monolithic virtual machines are deconstructed into groups of services (as a service-oriented architecture or SOA) that remain as independent VMs. The DevOps team operates the VMs with orchestration tooling, supports multitenancy and utilises continuous delivery of new software.

Third generation: The SOA is replaced with a loosely-coupled microservices architecture, either in VMs or containers. Services scale with horizontal clustering and orchestration oversight. Data pipelines support more advanced analytics, and the DevOps team is leveraging continuous deployment and continuous operation.

Fourth generation: The architecture is completely containerised microservices, and some microservices may even be replaced by serverless computing. The data stack has evolved to include machine learning and artificial intelligence, and the application is cloud-agnostic so it can run in any environment. Finally, DevOps leverages a dynamic resource pool and provides a no-downtime operating environment.
