
Storage strategies

By Kirsten Doyle, ITWeb contributor.
Johannesburg, 20 Mar 2019
Riccardo Fiorentino, CTO of EOH Technology Solutions.

Designing and planning a datacentre strategy in a world that's more digital and connected than ever before has become increasingly complicated. Organisations are spoiled for choice. Do they modernise an old facility, build a new one, lease, jump on the cloud train, use co-location, or any combination of these? And even though today's IT landscape is dominated by cloud, edge computing, IoT, AI and other disruptive technologies, the datacentre remains at the heart of the organisation. Its role is key to delivering IT services and providing storage and networking to an increasing number of networked devices, users and business processes.

Derek Rule, client executive: Indirect at SUSE, says the explosion of data, as well as businesses embracing digital transformation, are factors that play a part not only in storage strategies, but also in the evolution of the datacentre. "There's been a shift in datacentre design because of the profound changes in how the technology is implemented and managed. Virtualised servers have replaced the physical, and the rise of public and private clouds means there are pools of resources waiting to be consumed. Moreover, with the rise of the hybrid cloud environment, compute, network and storage have been decoupled from the hardware and are controlled by the IT department, making the datacentre a strategic asset where the rigidity and inefficiencies of hardware-defined infrastructure will be relics of the past."

"A business' choice of storage will have a critical impact on the safety, authenticity, privacy, accessibility and security of its data," says Peter French, MD, Synapsys. "Viewing storage as a one-size-fits-all solution in your business would be like thinking you could use the tyres from your everyday car on a racing car. In the same way that the F1 technicians would weigh up multiple options for a racing car's tyres, to best match track, weather conditions and suchlike, it's important to select the correct storage to best match your organisation's use cases. Different storage types have different 'best matches' and it's thus essential to identify the right technology and platform for the workload in question."

So how are businesses preparing their datacentre strategies for what lies ahead? Trent Odgers, cloud and hosting manager, SA at Veeam, believes the first step is understanding, from a technology perspective, where the business is, and which direction it wants to go. "A choice made today might evolve to something different, so ensure that wherever you go, it's easy to get in and easy to get out, giving you the freedom to experiment and refine multi-cloud solutions. The balance is around finding the right fit for your specific workload."

Odgers says once the business has complete visibility on its compute, storage, network, security and geographical requirements, as well as its focus, it can then make educated decisions on which strategy is the right one.
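
Odgers' "easy to get in and easy to get out" principle is often put into practice by keeping application code behind a thin, provider-neutral storage interface rather than coding directly against one vendor's SDK. The sketch below is a minimal, hypothetical Python illustration of that idea; the interface and class names are assumptions for this example, not any product's actual API.

```python
# Minimal sketch of an "easy in, easy out" storage abstraction.
# The interface and backends below are hypothetical illustrations,
# not any vendor's actual API.
from abc import ABC, abstractmethod
from pathlib import Path


class ObjectStore(ABC):
    """Provider-neutral interface the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None:
        ...

    @abstractmethod
    def get(self, key: str) -> bytes:
        ...


class LocalDiskStore(ObjectStore):
    """On-premise backend: objects are files under a local directory."""

    def __init__(self, root: str) -> None:
        self.root = Path(root)

    def put(self, key: str, data: bytes) -> None:
        path = self.root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self.root / key).read_bytes()


class InMemoryStore(ObjectStore):
    """Stand-in for a cloud object store in tests or experiments."""

    def __init__(self) -> None:
        self._objects = {}  # key -> bytes

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


def archive_report(store: ObjectStore, name: str, payload: bytes) -> None:
    """Application code depends only on the interface, so the backend
    (on-premise, public cloud, or a test double) can be swapped freely."""
    store.put("reports/" + name, payload)


if __name__ == "__main__":
    archive_report(LocalDiskStore("/tmp/reports"), "q1.csv", b"revenue,120\n")
    archive_report(InMemoryStore(), "q1.csv", b"revenue,120\n")
```

Because the calling code depends only on the interface, moving a workload between on-premise and cloud backends becomes a configuration change rather than a rewrite, which is what keeps the exit door open.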

Eran Brown, Infinidat CTO for EMEA, says companies need to return to their fundamental storage requirements, which include robust storage and data protection capabilities that can easily be leveraged to address their biggest data management and data protection challenges. They also need to consider whether they simply require primary storage, or also business continuity, disaster recovery, analytics and management, and what their security requirements are. Once this is established, the next logical step is to assess which environment, public cloud or on-premise, is best for the business. However, they need to understand the complexity and costs that go hand-in-hand with any cloud migration. The irony is that these businesses may not have to choose between public cloud and on-premise at all, as there are solutions that enable data to sit adjacent to the cloud, leveraging the best of all these options.

Impact on the business

Speaking of the ways storage impacts the business, Brown says previously, businesses only needed small data sets to be competitive. "However, as the entire market moves towards constantly improving customer experience (CX) by using algorithms and data harvesting, no organisation can afford to lag. As a result, companies need to start building well-designed, scalable data infrastructure that enables the business to achieve better CX while remaining cost-effective. Organisations need to leverage data in order to understand customer needs, improve CX and keep up with the competition."

Riccardo Fiorentino, CTO of EOH Technology Solutions, adds: "Internally, the business' employees are customers. In a digital landscape, those who use information to complete their day-to-day tasks want the information they need, delivered at the speed at which they're able to work. Both of these needs are affected by the storage solutions the business has put in place. Information on what most people view as traditional clients, or non-employees, can greatly enhance the customer's experience of the business. A great emphasis is placed on analytics these days, but analytics are only as good as the data, and the value of the storage solutions in this process, mainly software in this case, is often overlooked. Data or information is increasingly becoming a tradeable asset, both for corporate business and for cyber criminals. For this reason, governance, risk and compliance (GRC) for data is a real topic for any organisation."


Henk Olivier, MD of Ozone Information Technology Distribution, adds that cloud storage solutions also have an impact on the business in terms of risk management and GRC, particularly in light of the Protection of Personal Information (PoPI) Act in SA and the General Data Protection Regulation (GDPR) in the EU. "The primary reason most companies aren't taking this as seriously as they should, especially in SA, is that there's no formal governing body that manages, controls and takes responsibility for these laws and their compliance. The guidelines are there so the business knows what it has to be aware of in terms of GRC, but, ultimately, the business needs to ensure the data storage provider is compliant with the necessary regulations."

AI, big data

There's also the question of the role the datacentre plays in the use of big data and AI within an organisation. Olivier believes that any organisation that isn't paying attention to big data and AI is very likely to be left behind. "This sounds like a cliché, but these technologies allow the organisation to more accurately map and predict future trends and to gain a huge advantage when it comes to analysis and data management. These technologies provide the business with essential tools to make faster decisions, but they need to be supplemented by data and data analytics."

Fiorentino adds that, again, these tools are only as good as the data they are given to work with. "The faster you can deliver appropriate data to these systems, the faster you can get the desired outcomes, and the more accurate the insights or decisions are. Good storage solutions that can filter out and deliver quality data with minimal to zero human interaction will greatly affect the ROI of big data and AI investments."
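
Fiorentino's point about delivering quality data "with minimal to zero human interaction" usually translates into automated validation rules applied before data reaches the analytics or AI layer. The snippet below is a deliberately simple, hypothetical sketch of such a gate in Python; the field names and rules are assumptions for illustration only.

```python
# Hypothetical sketch of an automated data-quality gate: records that fail
# basic checks are filtered out (or routed for remediation) before they
# reach analytics or model training. Field names and rules are illustrative.
from datetime import datetime
from typing import Iterable, Iterator


def is_clean(record: dict) -> bool:
    """Apply simple validity rules; real pipelines would add schema,
    range, deduplication and referential checks."""
    try:
        amount = float(record["amount"])
        datetime.fromisoformat(record["timestamp"])
    except (KeyError, ValueError, TypeError):
        return False
    return amount >= 0 and bool(record.get("customer_id"))


def quality_gate(records: Iterable[dict]) -> Iterator[dict]:
    """Yield only records that pass the checks, with no human in the loop."""
    return (r for r in records if is_clean(r))


if __name__ == "__main__":
    raw = [
        {"customer_id": "C1", "amount": "199.90", "timestamp": "2019-03-20T10:15:00"},
        {"customer_id": "",   "amount": "50.00",  "timestamp": "2019-03-20T10:16:00"},
        {"customer_id": "C3", "amount": "oops",   "timestamp": "2019-03-20T10:17:00"},
    ]
    print(list(quality_gate(raw)))  # only the first record survives
```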


Kevin Kemp, executive at PBT Group, says data should be viewed as a core asset that drives business evolution and growth. "Transforming into a data-rich business means that a focus should be placed on data capture processes to ensure data quality, so that the benefits of trends like AI are not only reaped, but deliver enhanced outputs. Making use of data that is of poor quality will result in undesired or unusual AI outcomes. As such, data management plays a critical role in businesses being able to take up the benefits that AI has to offer. Effective data management is also going to become more critical as IoT architectures mature and become more integrated. These technologies are all centred around data, and if the data is not seen as a priority and managed effectively, the results are bound to be ineffective."

IoT, AI and big data are no different from any previous data-hungry technologies that organisations adopt, in that they enable organisations to reduce costs, accelerate revenue, or identify and develop new business opportunities, says Brown. "However, organisations also need to leverage this data quickly to stay ahead of the competition."

A need for speed

Fiorentino says in this pursuit of speed, organisations are turning to flash. "A system as a whole is only as fast as its slowest item, and the most lethargic item has always been the humble disk. Over the decades, we've seen interface changes - SCSI, SATA, SAS and similar - and differing spinning disk speeds. Then we moved to solid-state drives (SSDs) and now all-flash. In terms of systems, flash and whatever technologies supersede it are really just stopgaps until in-memory computing becomes a reality. For big data and machine learning, the sizes and costs of flash are still prohibitive, but there's adoption in the local market through hybrid storage systems and hyperconverged solutions. However, organisations aren't replacing their spinning disk storage completely as the capacities just aren't there yet."
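
Fiorentino's "slowest item" observation is easy to quantify with rough, order-of-magnitude figures. The numbers in the sketch below are approximate assumptions for a single small random read, not measurements of any product, but they show why the storage tier dominates end-to-end speed.

```python
# Back-of-the-envelope illustration of why the storage tier dominates
# end-to-end speed. Latencies are rough, order-of-magnitude assumptions
# for a single 4 KB random read, not vendor benchmarks.
MEDIA_LATENCY_SECONDS = {
    "spinning disk (seek + rotate)": 10e-3,   # ~10 ms
    "flash SSD":                     100e-6,  # ~100 microseconds
    "DRAM (in-memory)":              100e-9,  # ~100 nanoseconds
}

RANDOM_READS = 1_000_000  # e.g. a scattered analytics working set

for medium, latency in MEDIA_LATENCY_SECONDS.items():
    total = RANDOM_READS * latency
    print(f"{medium:30s} -> {total:10.2f} s for {RANDOM_READS:,} reads")

# Approximate result (serial reads; queue depth and parallelism narrow
# but do not close the gap):
#   spinning disk -> ~10,000 s, flash -> ~100 s, DRAM -> ~0.1 s
```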

French agrees, adding that flash brings significantly improved performance with reduced cooling and power requirements. "Add to that the benefit of a reduced physical footprint and you have a solution that's faster, higher-density and greener. All-flash arrays (AFAs) typically have intelligent software that can optimise the hardware for specific workloads, as well as perform advanced functions on the data that's written to, stored, and served up on the platform."

Brown adds a caveat: "AFAs are costly and don't deliver better performance across all workloads. They're regarded as the 'silver bullet' for datacentres, but come at a high premium. Intelligent software combined with commodity hardware can offer better performance, reliability and price without the use of AFAs. The increased use of encryption will also put a strain on those who use AFA storage, because AFA is incapable of cost-effectively storing encrypted data, due to the need for more flash capacity when data is encrypted, resulting in an increase in the cost per gigabyte of data stored. Organisations in SA need an offering that supports both encrypted and unencrypted data."
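
Brown's encryption caveat comes down to data reduction: flash economics lean heavily on compression and deduplication, and good ciphertext is statistically indistinguishable from random data, so it barely reduces. The hedged Python sketch below illustrates the principle; os.urandom is used as a stand-in for encrypted output so no crypto library is needed, and the figures are not a test of any array.

```python
# Illustration of why encrypted data inflates flash capacity needs:
# compression works on redundant plaintext but not on ciphertext.
# os.urandom() stands in for encrypted output, since good ciphertext
# is statistically indistinguishable from random bytes.
import os
import zlib

plaintext = b"customer_id,amount,timestamp\n" * 40_000   # repetitive records
ciphertext_stand_in = os.urandom(len(plaintext))

for label, blob in [("plaintext", plaintext), ("'encrypted'", ciphertext_stand_in)]:
    compressed = zlib.compress(blob, 6)
    ratio = len(blob) / len(compressed)
    print(f"{label:12s} {len(blob):>10,} B -> {len(compressed):>10,} B "
          f"(reduction {ratio:.1f}:1)")

# Typical result: the plaintext shrinks dramatically, while the random
# "encrypted" blob compresses to roughly its original size, so the array
# needs close to one byte of flash per byte stored.
```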

Another important development is software-defined storage (SDS), but where does it fit in with traditional SANs or flash SANs? Rule says IT has traditionally worked out its storage requirements by looking at data growth levels and making a projection. "However, as the local market progresses towards SDS, businesses should ideally have one storage pool, one storage budget, all on an unlimited scale. With our current economic challenges, local businesses are tightening operational costs and are looking for ways to be leaner and more innovative, and this is where SDS and cloud solutions are changing the way they budget and how they tackle the big data challenges of today. SDS gives businesses true scalability and flexibility using commodity hardware, and delivers value by disaggregating the software and the hardware, allowing businesses to increase deployment flexibility and reduce hardware costs. Also, because SDS is usually designed with scale in mind, it automates the basic maintenance of storage, decreasing the administrative overhead."
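
A highly simplified way to picture Rule's "one storage pool" is as a single logical namespace that aggregates whatever commodity capacity is attached, with placement decided in software. The Python toy below is a conceptual sketch under that assumption only; production SDS platforms (Ceph, the basis of SUSE's SDS offering, is one example) add replication, failure domains, rebalancing and self-healing on top of the same idea.

```python
# Conceptual toy illustrating the SDS idea of one logical pool spread
# across commodity nodes, with placement decided in software.
# This is not any product's algorithm; real systems (e.g. Ceph's CRUSH)
# also handle replication, rebalancing and failure domains.
import hashlib
from dataclasses import dataclass, field


@dataclass
class Node:
    name: str
    capacity_gb: int
    used_gb: int = 0


@dataclass
class StoragePool:
    nodes: list = field(default_factory=list)

    def add_node(self, node: Node) -> None:
        """Growing the pool is just attaching more commodity capacity."""
        self.nodes.append(node)

    def total_free_gb(self) -> int:
        return sum(n.capacity_gb - n.used_gb for n in self.nodes)

    def place(self, object_id: str, size_gb: int) -> Node:
        """Deterministic hash-based placement: no central lookup table,
        any client can compute where an object lives."""
        index = int(hashlib.sha256(object_id.encode()).hexdigest(), 16) % len(self.nodes)
        node = self.nodes[index]
        node.used_gb += size_gb
        return node


if __name__ == "__main__":
    pool = StoragePool()
    for i, cap in enumerate([4000, 4000, 8000]):   # mixed commodity boxes
        pool.add_node(Node(f"node-{i}", cap))
    target = pool.place("backup-2019-03-20.tar", 500)
    print(f"placed on {target.name}; pool free: {pool.total_free_gb()} GB")
```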

Olivier says there are pros and cons to SDS. "It has great scalability, improved performance and overall agility, and old and new hardware can be combined. However, businesses need to manage another software layer, and complexity can creep in at larger scale. A single localised system is easier to manage, but spanning multiple locations and sites makes things more complex. Certain SDS solutions are also only compatible with certain types of hardware."

With hyperconverged infrastructure, Olivier says the main advantage is a simplified deployment process. "The hardware is performance-matched by the vendor to prevent any single hardware component from becoming a major performance bottleneck. However, although this sounds like a great idea, there's a problem with it - not all workloads are created equal. The biggest disadvantage is their inherent inflexibility."

Brown says SDS and hyperconverged are both excellent solutions for small to medium-sized enterprises looking for a 'single pane of glass' that's more manageable for their small IT teams. "However, with data reaching capacities of 100TB or above, the increased complexity driven by the various components involved, and the challenge of troubleshooting this highly distributed environment, has a negative impact on the value to the business. At such scale, a system designed from the ground up to meet the needs of high-capacity, high-performance environments yields better results from both a technical and a financial perspective. Bundling the compute and storage into the same building block means that a customer will always have to procure by the highest common denominator. At scale, these additional expenses add up and often make the total cost of ownership unbearable to the business over time."
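
Brown's "highest common denominator" argument is essentially arithmetic: if capacity and compute can only be bought together, every storage-driven expansion also buys compute that may not be needed. The worked example below uses entirely hypothetical unit prices and node sizes purely to show the shape of the comparison, not real vendor pricing.

```python
# Worked illustration of the bundled vs disaggregated scaling argument.
# All prices and sizes are hypothetical assumptions, not vendor figures.

STORAGE_NEEDED_TB = 600          # growth is storage-driven
COMPUTE_NEEDED_NODES = 10        # compute demand is flat

# Hyperconverged: each node bundles compute + 30 TB of storage.
HCI_NODE_TB = 30
HCI_NODE_COST = 40_000

# Disaggregated: buy compute nodes and storage capacity separately.
COMPUTE_NODE_COST = 25_000
STORAGE_COST_PER_TB = 300

hci_nodes = max(COMPUTE_NEEDED_NODES, -(-STORAGE_NEEDED_TB // HCI_NODE_TB))  # ceil division
hci_cost = hci_nodes * HCI_NODE_COST

disagg_cost = (COMPUTE_NEEDED_NODES * COMPUTE_NODE_COST
               + STORAGE_NEEDED_TB * STORAGE_COST_PER_TB)

print(f"hyperconverged: {hci_nodes} bundled nodes      -> {hci_cost:,}")
print(f"disaggregated:  10 compute + 600 TB storage -> {disagg_cost:,}")

# With these assumptions the bundled design buys 20 nodes of compute just to
# reach 600 TB of capacity, roughly doubling the spend. The crossover point
# moves with the (hypothetical) prices, which is exactly why the decision is
# capacity- and workload-dependent.
```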
