No one-size-fits-all for enterprise storage
At the same time, many organisations need help with data management because they can’t get a grip on the growing complexity brought on by remote working, an increasing flood of data, and a slew of new technologies, from 5G to IoT and the intelligent edge.
Hayden Sadler, country manager for South Africa at Infinidat, says the various types of storage platforms try to balance cost versus capabilities. However, there’s always a compromise on one or more of the key tenets of enterprise storage – performance, scale, availability and cost. “With storage arrays that are reliant on the media they use to deliver on their capabilities, it has been impossible to offer the highest performance, scalability and availability at an affordable price. The capabilities of an enterprise storage platform that’s defined by the architecture and storage media are intrinsically tied to the cost of the media.”
The different types of storage, such as storage area network (SAN), network attached storage (NAS) and direct attached storage (DAS), and the storage media used, such as solid state drive (SSD), hard disk drive (HDD) and hybrid, are therefore generally positioned for certain use cases.
For example, DAS, storage connected directly to a PC, may be positioned where cost is more important than scalability and/or high availability, Sadler says. “A small SAN array, using only SSD/flash media, may be positioned where performance is key, but scalability isn’t required – this keeps costs down due to expensive media. Ideally, an enterprise storage platform should deliver the highest levels of performance, scalability and availability. This, of course, at an affordable price.”
The days of CIOs tolerating people showing up at their datacentres to add additional capacity will be over. – Hayden Sadler, Infinidat
In the world of data storage, there are three main approaches to storing your data, adds Daniel Teixeira, SE manager at Pure Storage – file, block, and object.
“Block storage breaks data up into separate pieces – fixed-sized blocks of data. It allows the underlying storage system to retrieve it no matter where it gets stored. Its benefits are that it’s efficient, easy to read and write and has low latency.
“File storage means data is stored and managed like a file within a folder. If you’ve ever accessed files on a hard drive, you’ve dealt with file storage. File storage systems provide readability and convenience for users. Additional benefits include easy-to-scale archives and data protection.
“In object storage, data is stored and managed as self-contained units called objects. Object storage is the format of choice for public cloud storage services like Amazon S3, and supports the architecture used by most websites and Software-as-a-Service apps. Benefits include scalability, speedy analysis, API support and improved data integrity,” he says.
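The difference between these access models can be made concrete with a toy Python sketch. It is purely illustrative and not tied to any vendor's API: block storage is addressed by fixed-size block number, file storage by a familiar path (the ordinary `open()` call), and object storage by a key with attached metadata.

```python
import io

# --- Block storage: fixed-size blocks, addressed by block number ---
BLOCK_SIZE = 4  # real systems use e.g. 4 KiB; 4 bytes keeps the demo readable

class BlockDevice:
    def __init__(self, num_blocks):
        # An in-memory "disk" of zeroed blocks
        self.disk = io.BytesIO(b"\x00" * (num_blocks * BLOCK_SIZE))

    def write_block(self, n, data):
        assert len(data) == BLOCK_SIZE
        self.disk.seek(n * BLOCK_SIZE)  # caller addresses data by block offset
        self.disk.write(data)

    def read_block(self, n):
        self.disk.seek(n * BLOCK_SIZE)
        return self.disk.read(BLOCK_SIZE)

# --- Object storage: self-contained objects, addressed by key, with metadata ---
class ObjectStore:
    def __init__(self):
        self.objects = {}

    def put(self, key, data, metadata=None):
        # The object carries its data and metadata together as one unit
        self.objects[key] = {"data": data, "metadata": metadata or {}}

    def get(self, key):
        return self.objects[key]["data"]

dev = BlockDevice(num_blocks=8)
dev.write_block(2, b"ABCD")        # the caller must track which block holds what
print(dev.read_block(2))           # b'ABCD'

store = ObjectStore()
store.put("reports/q3.pdf", b"report-bytes", metadata={"owner": "finance"})
print(store.get("reports/q3.pdf"))
```

File storage is the model in between: data is reached by name through a directory hierarchy, which is exactly what Python's built-in `open("folder/file.txt")` gives you.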
Swings and roundabouts
So what are the pitfalls of the different types of enterprise storage?
According to Modeen Malick, principal systems engineer at Commvault, DAS has become less common at an enterprise level due to the sheer volume of data that needs to be stored, and it’s also less flexible than other storage types: it can only be accessed by the device it’s connected to, which limits its application, especially in today’s digital and connected world. NAS is similarly limited, because devices have to be connected to the network to access data; in a remote working world, this has serious productivity implications.

SAN requires Fibre Channel connectivity, which is costly. The hardware used in SANs is also expensive, complex to deploy and requires specialist skills, which means ongoing maintenance and management costs. Another option, software-defined storage (SDS), comes with integration and interoperability challenges, and using low-cost, low-performance hardware could mean low-performing storage. SDS systems are typically distributed systems, which can be challenging to maintain, and there’s often additional complexity as scale grows.

When it comes to cloud, cost is often one of the main drivers; however, while the upfront cost may be less, the lifetime cost needs to be considered from a return-on-investment perspective. In addition, if applications are hosted locally but data is stored in the cloud, there may be additional networking costs and latency, and speed of data access is also a potential pitfall, depending on bandwidth availability. There are also data sovereignty concerns, compliance challenges, and the possibility of ‘noisy neighbours’, where one high-traffic user might negatively impact the performance of another.
Containers and Kubernetes are the driving force behind how the industry is reinventing the way applications are built and run, driving enterprise IT efficiency. – Daniel Teixeira, Pure Storage
Speaking of the trends affecting the enterprise storage arena, Sadler says despite an increased interest in migrating to the cloud, in reality, the cloud doesn’t work for every scenario. The issues mentioned above make the cloud ill-suited in many cases. “On the other hand, traditional storage procurement cycles can take months, and this lack of agility is detrimental. A ‘pay as you grow’ pricing model is economical and allows organisations to immediately start using capacity to accommodate storage requirements. This enables enterprises to leverage storage and resources that meet the needs of multiple workloads, in multiple locations. Relentless growth and demand for storage capacity, along with petabyte scale workloads and applications, is changing the face of storage requirements, both globally and in South Africa. What we’ve seen is a growing need for on-premise storage available on demand, to deliver capacity at scale, agility, speed to market, a better customer experience through real-time analytics and lower, more predictable costs.”
The big three
There are three major trends driving customer demand currently, adds Teixeira: the need for flexibility, the increased use of containers, and the rise in ransomware attacks.
Customers are increasingly demanding flexible consumption models, as the old model of technology consumption and deployment no longer works for many. They want the flexibility to scale capacity as required. They want a service model in an automated fashion, one where the complexity of day-to-day management is removed.
Containers and Kubernetes are the driving force behind how applications are built and run, driving enterprise IT efficiency. Enterprises are evolving their cloud strategies to be multicloud, and containers are also key to this. Containers make it easier to roll out cloud-based applications because they contain all the information needed to run them in manageable packages. For those running infrastructure across multiple environments, Kubernetes and containers can help to provide flexibility and enable users to easily migrate traditionally hard-to-move data between environments.
Finally, there’s the grim reality that ransomware attacks are increasing at unprecedented rates, and are no longer a question of ‘if’ but ‘when’. Effective infrastructure management plays a critical role in mitigating and recovering from an attack. Awareness of what’s ‘normal’ in how infrastructure operates is essential; without this, it could take weeks to see something ‘abnormal’ that flags data or systems might be compromised. Following an attack, enabling a fast recovery and return to normal business operations is vital. Organisations need valid, immutable backup copies of their data, which are protected and can’t be eradicated, modified or encrypted. This, coupled with the ability to restore data quickly and at scale, is paramount.
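The ‘immutable copy’ idea can be sketched as a write-once, content-addressed store: each backup copy is keyed by its content hash, existing copies are never overwritten, and integrity is re-verified on restore. This is a hypothetical illustration of the principle, not how any particular backup product works.

```python
import hashlib

class ImmutableBackupStore:
    """Toy write-once (WORM-style) store. Each backup copy is keyed by its
    SHA-256 content hash; existing entries are never overwritten, and every
    restore re-verifies the hash, so silent tampering is detectable."""

    def __init__(self):
        self._objects = {}

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        # setdefault makes writes idempotent: an existing copy is never replaced
        self._objects.setdefault(digest, data)
        return digest

    def get(self, digest: str) -> bytes:
        data = self._objects[digest]
        # Verify integrity on restore: any modification changes the hash
        if hashlib.sha256(data).hexdigest() != digest:
            raise ValueError("backup copy corrupted")
        return data

store = ImmutableBackupStore()
receipt = store.put(b"payroll database dump, 2023-10-01")
restored = store.get(receipt)   # round-trips the original bytes
```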
Of course, there’s no discussion about enterprise storage without thinking about security and privacy. Malick believes that data security should always start with basic principles, such as strong passwords and up-to-date firmware, backed by best practices such as encrypting data both at rest and in transit, role-based security, multi-factor authentication and an effective backup and disaster recovery strategy. Data governance becomes crucial, especially in light of the growing body of data privacy legislation. Also, as cloud becomes mainstream, data protection and availability are critical: encryption at 256 bits or higher is important, as is an immutable copy of data. We’re also seeing relatively new security techniques such as air gapping, which protects data by creating a separate network with no connectivity to public networks.
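To make the multi-factor authentication point concrete: the TOTP scheme (RFC 6238, the basis of most authenticator apps) fits in a few lines of standard-library Python. This is an illustrative sketch of the algorithm, not production authentication code.

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    # RFC 4226: HMAC-SHA1 over the big-endian counter, then dynamic truncation
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(key: bytes, at=None, step: int = 30, digits: int = 6) -> str:
    # RFC 6238: HOTP with a counter derived from the current 30-second window
    t = int(time.time() if at is None else at)
    return hotp(key, t // step, digits)

# The server and the user's authenticator app share the secret; both compute
# the same short-lived code independently, which is the second factor.
secret = b"12345678901234567890"          # RFC test key, for illustration only
print(totp(secret, at=59, digits=8))      # 94287082 (RFC 6238 test vector)
```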
When it comes to privacy, Sadler says that, looking at the broader picture, data at rest encryption may not be sufficient on its own; data also needs to be secured further up the stack from the storage platform. This is because once data leaves the storage media and is migrated to another layer, data at rest encryption no longer protects it.
Multicloud strategies will become the de-facto standard, allowing flexibility to support a variety of use cases, business units, and development groups. – Modeen Malick, Commvault
“The biggest and most severe data breaches that have affected both the public and private sectors all operate at the application layer. This includes almost all versions of both malware and advanced persistent threat attacks. Because of this, encrypting at the application layer is the best form of encryption that will address these serious threats. However, enterprise storage platforms that rely on data reduction ratios to provide an acceptable commercial model may no longer be affordable when deployed in an environment that leverages application-level encryption. This is because storage-based deduplication and compression generally cannot occur on data that’s already encrypted at the application layer. Ideally, an enterprise storage platform positioned in these environments, should be able to deliver an acceptable commercial model without relying on data reduction ratios for the data storage capacity required by customers,” he says.
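Sadler's point about data reduction is easy to demonstrate: redundant plaintext compresses (and deduplicates) well, but once it's encrypted at the application layer it looks like random noise to the storage layer, leaving nothing for deduplication or compression to reclaim. A quick standard-library Python sketch (the keystream below is a toy used only to produce high-entropy ciphertext; it is not real cryptography):

```python
import hashlib
import os
import zlib

def toy_stream_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy keystream cipher (SHA-256 in counter mode), used purely to turn
    # structured data into full-entropy bytes -- NOT production cryptography.
    out = bytearray()
    for i in range(0, len(plaintext), 32):
        keystream = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        block = plaintext[i:i + 32]
        out.extend(b ^ k for b, k in zip(block, keystream))
    return bytes(out)

key = os.urandom(32)
plaintext = b"customer-record," * 4096     # highly redundant, reduction-friendly
ciphertext = toy_stream_encrypt(key, plaintext)

# Redundant plaintext compresses to a tiny fraction of its size...
print(len(zlib.compress(plaintext)) / len(plaintext))
# ...but the ciphertext yields no reduction at all (ratio around 1.0)
print(len(zlib.compress(ciphertext)) / len(ciphertext))
```

The same effect hits storage-side deduplication: two identical records encrypted under different keys (or with different counters) produce unrelated ciphertext, so duplicate blocks are never detected.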