To get the most out of the cloud, you need the right applications. Here’s what to look for.

How to recognise a genuine cloud native application

The definitions of cloud native application are varied and often confusing. Many so-called cloud native applications don’t take full advantage of cloud infrastructure – but how do you tell the good from the bad?

For example, some definitions focus on the technology architecture of the application: Is the architecture containerised, built with microservices and/or designed with DevOps in mind?

The reality is that applications built with these technologies may or may not be cloud native. This article outlines the attributes (in my informed, though admittedly biased, view) that a truly cloud native application must have.

Divide services to manage them

The first thing a great cloud native application needs to be is modular and ‘loosely coupled’. This is the part of the definition people describe with reference to microservices. Essentially, this breaks up functions of an application into separate smaller services that each address a single need.

For example, an application to manage the books in a library might have a service for recording loans to library users; another tracking books brought in from other libraries; and a third for reporting on which books are currently available. This allows each part of the application to be developed separately and optimised independently.

In turn, this allows the application to become loosely coupled: these services no longer have to be written by the same people, in the same programming language, or even deployed on the same infrastructure.
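As a minimal sketch of what this can look like in practice, the following Go program stands in for two of those library services, each owning a single concern behind its own HTTP endpoint. The service names, routes and ports are purely illustrative assumptions; in a real deployment each service would be built and deployed independently, typically as its own container.

```go
package main

// Illustrative sketch only: two independent services from the library example,
// each addressing a single need. In practice each would live in its own
// repository, process and container rather than sharing a main().

import (
	"encoding/json"
	"log"
	"net/http"
)

// loanService records loans to library users and knows nothing about
// inter-library transfers or availability reporting.
func loanService() {
	mux := http.NewServeMux()
	mux.HandleFunc("/loans", func(w http.ResponseWriter, r *http.Request) {
		// A real service would persist the loan; here we simply acknowledge it.
		json.NewEncoder(w).Encode(map[string]string{"status": "loan recorded"})
	})
	log.Fatal(http.ListenAndServe(":8081", mux))
}

// availabilityService reports which books are currently available.
func availabilityService() {
	mux := http.NewServeMux()
	mux.HandleFunc("/available", func(w http.ResponseWriter, r *http.Request) {
		json.NewEncoder(w).Encode(map[string][]string{
			"titles": {"The Bone People", "The Luminaries"},
		})
	})
	log.Fatal(http.ListenAndServe(":8082", mux))
}

func main() {
	go loanService()
	availabilityService()
}
```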

Figure 1: Cloud Native Application Characteristics

Conscious decoupling

Champions of containerised technology tell us cloud native applications need to have their code and their data decoupled from one another. Specifically, they’ll talk about separating the ‘stateful’ parts of an application into a ‘persistence layer’ – essentially treating an application’s code and its data separately.

There are good technical reasons for this: data is slow to move around. Making a full copy of one hundred terabytes of data takes a long time, especially compared with code-related tasks such as adding a new instance of a container to a load-balanced pool – a process that can take a few seconds.

Just as importantly, the techniques and requirements for protecting and maintaining data are different to those for protecting and maintaining code. For example, in New Zealand the Public Records Act mandates that organisations keep certain data for seven years – but doesn’t provide any guidelines about the code that was used to process that data.

All of this means it makes sense to treat data and code separately: separation allows each to be managed, largely independently, by people skilled in one or the other.
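A sketch of what this separation can look like in code, assuming a hypothetical LoanStore boundary: the application logic below holds no state of its own, so its instances can be created and destroyed freely, while the stateful part sits behind an interface and can be protected and retained on its own terms.

```go
package library

// Sketch of keeping code stateless and pushing state behind a persistence
// layer. LoanStore is an assumed boundary: it might be backed by a managed
// database or an object store, and the service code does not care which.

import "time"

// Loan is the record the persistence layer is responsible for retaining
// (for example, to meet records-retention obligations).
type Loan struct {
	BookID   string
	MemberID string
	Due      time.Time
}

// LoanStore is the persistence layer boundary.
type LoanStore interface {
	Save(l Loan) error
	Overdue(now time.Time) ([]Loan, error)
}

// LoanService holds no state of its own, so instances can be added to or
// removed from a load-balanced pool in seconds while the data stays put.
type LoanService struct {
	store LoanStore
}

func (s *LoanService) RecordLoan(bookID, memberID string) error {
	due := time.Now().AddDate(0, 0, 21) // three-week loan period, illustrative
	return s.store.Save(Loan{BookID: bookID, MemberID: memberID, Due: due})
}
```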

Reduce and manage cloud complexity by deploying ‘flat’ networks

Separating services and data in application design brings added complexity: networking elements are now needed to bring these pieces together whenever they need to talk.

Cloud native applications are characterised by being deployed on “flat” networks. A flat network does not have internal security boundaries or dividers.

Older architectures often split parts of an application across separate networks to optimise network performance and provide flexibility around security. The downside of this approach was that business logic and data flows were often controlled by network engineers rather than developers and data engineers, and any problem that required cross-functional coordination to diagnose and manage tended to cause confusion, even when it was ultimately solved.

Cloud native architectures, by contrast, largely push security and data flow functions into the application itself. This removes the application teams’ dependency on advanced networking skills and allows the network to focus solely on moving data as quickly and efficiently as possible.
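As one hedged illustration of pushing security into the application on a flat network, the middleware below has a service authenticate its callers itself rather than relying on network segmentation. The shared-secret scheme and the SERVICE_TOKEN variable are assumptions made for the sketch; real deployments would more likely use mutual TLS or signed tokens.

```go
package library

// Illustrative only: on a flat network, each service verifies who is calling
// it instead of trusting a network boundary to do so.

import (
	"crypto/subtle"
	"net/http"
	"os"
)

// requireServiceToken wraps a handler and rejects callers that do not present
// the expected service-to-service credential.
func requireServiceToken(next http.Handler) http.Handler {
	expected := "Bearer " + os.Getenv("SERVICE_TOKEN") // assumed to be injected at deploy time
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		got := r.Header.Get("Authorization")
		if subtle.ConstantTimeCompare([]byte(got), []byte(expected)) != 1 {
			http.Error(w, "unauthorised", http.StatusUnauthorized)
			return
		}
		next.ServeHTTP(w, r)
	})
}
```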

The loose coupling of application components brings with it one key benefit: cloud native applications do not need to be created with the underlying infrastructure in mind.

This allows cloud native applications to be broken into chunks small enough for a single person or team to understand and manage. The people maintaining the compute, storage and network hardware do not need to know how the applications work in order to provide a platform that can run those applications efficiently. At the same time, the application engineers do not need hardware optimisation skills to maximise performance from the underlying platform.

‘Connecting’ all the pieces of an application together

While it is one thing to break an application into manageable chunks, it is quite another to reconnect those chunks without losing the benefits we have just realised or degrading the overall performance of the application.

Application programming interfaces (APIs) are the best way to connect the pieces of an application together in a cloud native way. These define the required inputs and the format of any outputs of a part of an application. More broadly, APIs define the expectations for how a component will behave when called upon.

It also means that everything a development team needs to know about the services and resources they want to use is contained in the available APIs. The internal details of how a service works, which technologies it uses, and how those technologies are implemented are entirely irrelevant to making use of the resource – as long as its API responds correctly.

In this way APIs provide exactly the information required for disparate teams to be able to quickly and effectively interoperate, without becoming bogged down with unnecessary detail.
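A small sketch of what this means for a consuming team, reusing the hypothetical availability service from earlier: the client below knows only the endpoint and the documented shape of the response; how the service is implemented remains irrelevant to it.

```go
package library

// Illustrative client: it depends only on the availability service's API
// (its URL, inputs and response format), not on how the service is built.

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// AvailabilityResponse mirrors the output format the API publishes.
type AvailabilityResponse struct {
	Titles []string `json:"titles"`
}

// AvailableBooks calls the availability API and decodes its documented response.
func AvailableBooks(baseURL string) ([]string, error) {
	resp, err := http.Get(baseURL + "/available")
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusOK {
		return nil, fmt.Errorf("availability API returned %s", resp.Status)
	}
	var out AvailabilityResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return nil, err
	}
	return out.Titles, nil
}
```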

Make the application easy to see

The true power of APIs is that they facilitate another attribute of the cloud native application: scalability.

The economic benefits of using cloud services are largely realised by being able to provision and scale resources to match changing workloads.

In on-premises environments, there is rarely any benefit in powering off resources that are not being fully utilised, and the slow speed of provisioning often means hardware has to be over-provisioned well in advance of any spike in workload demand.

Cloud architectures allow resources to be provisioned or relinquished in seconds. Cloud native applications are modular, which simplifies resource management for each component so that each can be optimised for cost and performance. Equally, cloud utility billing models provide significant economic advantages by closely matching resources to workloads in near real time.

Realising these economic advantages requires the provisioning of resources to be loosely coupled to the applications. Decisions about how to resource an application should not require detailed knowledge of the inner workings of that application, merely an understanding that certain thresholds represent a need for additional resource or are an indication that excess resource could be removed.

This requires applications which can detect and report on their own requirements and can alert other components of the platform when internal performance bottlenecks are likely to arise.

Alongside an API for getting work done, a cloud native application must be able to record and publish key metrics about its own state and performance. Platform orchestration needs this information to be able to operate efficiently.
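As a minimal sketch of a service publishing its own state alongside its work API, the example below uses Go’s standard-library expvar package, which exposes counters at /debug/vars. The metric name is an assumption; many platforms would expose Prometheus-style metrics instead, but the principle is the same.

```go
package main

// Illustrative only: the service updates its own counter as it works and
// publishes it so the platform can scrape state and performance without
// knowing anything about the application's internals.

import (
	"expvar" // importing expvar registers /debug/vars on the default mux
	"log"
	"net/http"
)

var loansRecorded = expvar.NewInt("loans_recorded_total")

func main() {
	http.HandleFunc("/loans", func(w http.ResponseWriter, r *http.Request) {
		loansRecorded.Add(1) // the work API keeps the published metric current
		w.WriteHeader(http.StatusAccepted)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```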

Applications lend themselves to automation

The modularity of cloud native applications and the visibility of their key metrics lead naturally to automation.

Automation of application and resource provisioning is critical to scaling an application up or down with any reasonable speed or efficiency. It is the cornerstone of realising the benefits every cloud project lists as key goals: application resiliency, scalability, efficiency and financial economy.
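A sketch of the kind of decision such automation makes on the application’s behalf, assuming the application publishes a hypothetical queue-depth metric: read the metric, compare it to a threshold, and compute how many instances are needed. In practice this role is played by the platform’s autoscaler rather than hand-written code.

```go
package main

// Illustrative scaling loop: the metric source and the provisioning call are
// stubbed out; the point is that the decision needs only published metrics
// and thresholds, not knowledge of the application's internals.

import (
	"fmt"
	"time"
)

// currentQueueDepth would normally be read from the application's published metrics.
func currentQueueDepth() int { return 250 } // stubbed value for illustration

// desiredReplicas scales the instance count in proportion to load, within bounds.
func desiredReplicas(queueDepth, perInstance, min, max int) int {
	n := (queueDepth + perInstance - 1) / perInstance // round up
	if n < min {
		return min
	}
	if n > max {
		return max
	}
	return n
}

func main() {
	for i := 0; i < 3; i++ { // a real controller would run continuously
		n := desiredReplicas(currentQueueDepth(), 100, 2, 10)
		fmt.Printf("scaling to %d instances\n", n) // provisioning API call would go here
		time.Sleep(time.Second)
	}
}
```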

Final words

‘Cloud native’ is more than just a definition of what an application should look like. While the attributes above are not a complete definition, applications that are modular, loosely coupled, built on APIs, make their key metrics visible, and are provisioned automatically are the strongest candidates for being genuinely cloud native.

Cloud native is a mindset for building applications that breaks us out of the annual product release cycle and allows applications to be agile and elastic, improving continuously and incrementally.

