
Multicloud security doesn’t have to be complicated to be effective; it just has to be consistent

By Kumar Vaibhav, Solution Architect at In2IT

Johannesburg, 07 Jan 2022

As organisations in every industry shift infrastructure and services to the cloud by means of a multicloud strategy, their business assets, software and applications become distributed across several cloud-hosting environments. Despite the many business benefits – including agility, flexibility, competitive pricing, scalability and resilience, to list a few – there are several hurdles that must be addressed when adopting cloud across the business. Securing a plethora of clouds can be particularly tricky due to a lack of visibility across services and providers. With multiple clouds come multiple layers of risk, such as an increased attack surface, improper user management, constantly shifting workloads, DevOps and automation, all of which can get complicated.

Multiple cloud benefits

However, cloud security shouldn’t be as complicated as it has become. Despite cloud having been around for more than a decade, there is still a perception that it is ‘new’ technology, which makes people uncomfortable. Cloud is many things, including scalable, reliable and cost-effective, but it’s no longer new. While most organisations think they need on-premises security and their own data centres to secure their digital assets, the reality is that this approach is no longer sustainable – it’s time-consuming and cost-intensive to operate and manage, particularly in comparison to the cloud.

Security must meet in the middle

So how does cloud security compare to on-premises security? Essentially, there isn’t that much difference. It’s easy to think that on-premises is more secure because one has direct control over all the servers, systems and data living in that data centre. However, it’s important to remember when moving to the cloud that all cloud service providers, like Microsoft, Google and Amazon, have their own security measures in place. The main concern that businesses have when it comes to moving data to the cloud is that they’re uncertain where it will live, but realistically, it’s possible to have the same controls in the cloud as with on-premises security. The two go hand-in-hand, and security in the cloud is a responsibility that must be shared between the cloud service provider and the customer, depending on the service they’re using. The service provider has to ensure (in line with the SLA) that customer data is safe in their cloud, while the customer has to ensure everything in their environment up to the point where it onramps to the service provider is secured and that their users are properly managed.

Users are the weakest link in security

Proper user management is particularly important now that the workforce is split between working at home, in the office and out in the field, as 80% to 90% of all cyber breaches or attacks happen because of users. Whether users are tricked into giving out credentials, or credentials are compromised by exploiting vulnerabilities, the effect is the same, making it critical to implement and utilise multi-factor authentication (MFA) as part of a stringent identity management programme. Password sniffing and spoofing are easy, and there are thousands of ways attackers can gain unauthorised access to data, but having MFA drastically reduces the chances of being defrauded from the inside. In addition to MFA, it’s necessary to have a proper access control programme in place. Role-based access is one of the most important keys to preventing data leaks: not everyone should get the same level of access, and specific users must be granted only the permissions necessary to fulfil their job description.
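The MFA requirement above is most often met with time-based one-time passwords (TOTP, RFC 6238), the rotating six-digit codes produced by authenticator apps. As a rough illustration of why a sniffed password alone is no longer enough, the sketch below derives such a short-lived code from a shared secret using only the Python standard library (the parameters follow the RFC; any secret value used in practice would of course be per-user and confidential):

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, for_time=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    # The moving factor is the number of 30-second steps since the Unix epoch,
    # so a captured code expires almost immediately.
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at an offset given by the
    # low nibble of the last digest byte, then reduce to the desired digits.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code is bound to a narrow time window as well as the secret, intercepting it buys an attacker very little; RFC 6238 publishes test vectors that can be used to sanity-check an implementation like this one.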

Countering the DevOps risk

Securing web-based applications to ensure they’re not used as attack vectors is as simple as proper testing. One of the main problems with the DevOps approach, which is becoming increasingly popular because of the agility it enables, is that the fast pace of work can lead to an increase in coding mistakes, which can result in undetected bugs and errors. Attackers can exploit these coding mistakes to gain access to digital assets. To counter this risk, it is necessary to perform thorough, continuous vulnerability testing on web applications while following secure development best practices. Although penetration testing can be expensive, this cost needs to be evaluated against the real possibility that a single breach can cause untold damage, both reputational and financial.

Protecting against network threats and vulnerabilities in the cloud isn’t much different to securing web apps. It’s important to ensure that all applications and operating systems are up to date in terms of security patches, along with proper access control through a firewall and a secure perimeter. Access must be granted on a needs basis only, and when vulnerabilities are detected, they must be addressed as soon as possible. In the case of virtual machines, it’s important to have the appropriate security controls in place and to pay particular attention to endpoint hygiene. There’s no point in having anti-virus protection or a firewall if it’s incorrectly configured, malfunctioning or not reporting properly.
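One practical way to keep the fast DevOps cadence without shipping known-vulnerable components is to gate the pipeline on an automated dependency audit. The sketch below is a minimal, illustrative version of such a gate; the advisory feed and package names (`examplelib`, `legacy-parser`) are hypothetical stand-ins for a real, maintained vulnerability database:

```python
# Hypothetical advisory feed: package name -> versions with known vulnerabilities.
# A real pipeline would pull this from a maintained vulnerability database.
ADVISORIES = {
    "examplelib": {"1.0.0", "1.0.1"},
    "legacy-parser": {"2.3.0"},
}


def audit(installed):
    """Return (package, version) pairs that match a known advisory."""
    return [(name, version)
            for name, version in installed.items()
            if version in ADVISORIES.get(name, set())]


def ci_gate(installed):
    """Fail the build (return False) if any vulnerable dependency is found."""
    findings = audit(installed)
    for name, version in findings:
        print(f"VULNERABLE: {name}=={version}")
    return not findings
```

Run as one stage of the pipeline alongside web-app scanning and scheduled penetration tests, a check like this turns "patch as soon as possible" from a policy statement into a build failure that blocks the release.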

Visibility through simplification

Secure Access Service Edge (SASE), as defined by Gartner, can make a difference here. SASE is a security framework specifying that security and network connectivity technologies should come together in a single cloud-delivered platform to enable rapid, secure cloud transformation. In addition to providing a singular point through which services are delivered to the client, this also streamlines network access and security measures, while eliminating operational complexity by reducing the number of vendors involved and helping to protect the business from third-party vulnerabilities. This plays a massive role in achieving visibility and transparency in cloud environments, along with the fact that public cloud providers generally have their own compliance requirements to meet, such as ISO 27001, PCI DSS and HIPAA – all of which can be passed on to the customer.

Secure the data wherever it goes

Ultimately, the most effective approach to securing anything in the cloud will be one that focuses on securing data both in transit and at rest. Asset protection is important, and visibility is critical given the scalability and flexibility of the cloud. Endpoint protection is required to secure servers, workstations and any other machines in the cloud, along with operational security, which ensures that when any changes are made, these occur without accidentally opening system loopholes. Monitoring is just as vital, along with vulnerability and penetration testing. Finally, to ensure security and continuity, businesses should avoid putting all their eggs into a single cloud basket. Using multiple clouds ensures that if one goes down, there’s another ready to take its place and ensure security through business continuity.

Editorial contacts
Evolution PR Lovejoy Shangase (011) 462 0628 lovejoy@evolutionpr.co.za