Efficient storage an enabler


Efficient storage of an organisation's data improves performance, saves costs, and underpins essential backup and archiving.

"One of the most important factors that organisations should be considering when selecting a storage vendor is the level of support the specific vendor provides to its customers," says Frederik Strydom, senior systems engineer, SAS specialist at EMC, commenting on the results of the ITWeb/EMC 2015 Storage Survey which ran online for two weeks during December 2015.

A third of survey respondents cited performance as the key factor when choosing a storage vendor; brand reputation came in second at 16%, while industry analysis ranked least important at 7%.

According to Strydom, there are a number of checks to bear in mind when choosing a storage vendor.

On the topic of level of support, he continued: "For instance, do their storage products have the ability to securely notify the vendor of any failures with the correct priority level and do they have a 24-hour support team to provide support where needed?"

Strydom believes that storage vendors must keep a local spares store to ensure that stock of all major components is readily available. They should also have the necessary expertise to troubleshoot and recover, if and when needed.

Strydom recommends that customers look at the sustainability of the vendor's business to ensure continued support.

"A good way to gauge this," he says, "is not only to look at their current revenue, but to also look at how much a company spends on research and development. If they spend money to innovate and develop, they will, in all likelihood, continue to be a force to be reckoned with in the future."

It emerged from the survey that 46% of respondents stated that when purchasing a Tier 1 storage array, RAS (reliability, availability and serviceability) is the most important criterion. Predictable performance follows at 20%.

"RAS is obviously extremely important. Any downtime could lead to a company losing millions, damage its reputation or have dire consequences for the IT staff themselves," Strydom comments.

Providing the right level of predictable performance is second only to availability, he continues. Unpredictable levels of performance not only affect user experience, but also result in operational overheads for the storage administrators.

Software defined storage technologies set to grow

Furthermore, it emerged from the survey that 67% of respondents are currently using, or will consider using, software defined storage (SDS) technologies to streamline their storage operational tasks.

"Software defined storage provides an abstraction layer that hides the complexities of the underlying storage hardware from the storage administrator. This not only reduces complexity for storage administrators, but also handles routine tasks like Fibre Channel zoning. It can even automatically set up replication based on the policy selected for the storage provisioned," Strydom advises.
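The policy-driven provisioning Strydom describes can be illustrated with a small sketch. This is a toy abstraction layer, not a real SDS product API; the policy names and settings are assumptions made for illustration.

```python
# Illustrative sketch of SDS-style policy-based provisioning.
# Policy names and settings below are hypothetical, not a vendor API.
POLICIES = {
    "gold":   {"raid": "RAID-10", "replication": True,  "zoning": "auto"},
    "silver": {"raid": "RAID-5",  "replication": False, "zoning": "auto"},
}

def provision_lun(name: str, size_gb: int, policy: str) -> dict:
    """Provision a LUN by policy; the caller never touches zoning,
    RAID layout or replication settings directly."""
    settings = POLICIES[policy]
    return {"lun": name, "size_gb": size_gb, **settings}

lun = provision_lun("finance-db", 500, "gold")
print(lun["replication"])  # True: replication came from the policy, not manual setup
```

The administrator selects "gold" or "silver"; the abstraction layer applies zoning and replication automatically, which is the complexity reduction the quote refers to.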

He says going forward SDS will even allow migration between arrays by moving a LUN from one virtual array to another.

"You can only imagine the amount of effort and time saved on migration for the tech refresh of an array. Other benefits in the future might include moving LUNs between arrays automatically based on array utilisation and performance, similar to DRS on VMware."
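The DRS-style rebalancing Strydom speculates about could work roughly as follows. This is a loose sketch of the idea only; the data structures and the 80% threshold are invented for illustration, and a real implementation would migrate data non-disruptively rather than update counters.

```python
# Hypothetical sketch: move LUNs off over-utilised arrays, analogous to
# how DRS rebalances VMs across hosts. All names and thresholds are assumed.

def pick_target(arrays: dict, exclude: str) -> str:
    """Choose the least-utilised array other than the source."""
    candidates = {k: v for k, v in arrays.items() if k != exclude}
    return min(candidates, key=lambda a: candidates[a]["used"] / candidates[a]["capacity"])

def rebalance(arrays: dict, threshold: float = 0.8) -> list:
    """Move LUNs until no array exceeds the utilisation threshold."""
    moves = []
    for src, info in arrays.items():
        while info["luns"] and info["used"] / info["capacity"] > threshold:
            lun_name, lun_size = info["luns"].pop()
            dst = pick_target(arrays, src)
            info["used"] -= lun_size
            arrays[dst]["used"] += lun_size
            arrays[dst]["luns"].append((lun_name, lun_size))
            moves.append((lun_name, src, dst))
    return moves

arrays = {
    "A": {"capacity": 100, "used": 90, "luns": [("lun1", 30), ("lun2", 30), ("lun3", 30)]},
    "B": {"capacity": 100, "used": 10, "luns": []},
}
print(rebalance(arrays))  # [('lun3', 'A', 'B')] -- one move brings A under 80%
```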

It comes as no surprise that 65% of respondents said they expect up to 50% growth in their storage environment within the next 12 to 24 months. A very small percentage (2%) said they do not expect any storage growth whatsoever.

According to Strydom, there is quite a variety of solutions for curbing the cost and physical footprint of an organisation's storage.

"One solution is to reduce the amount of physical space consumed by stored data. Thin provisioning, deduplication, compression and integrated copy management with array-based SNAPs all help to reduce the amount of capacity required to store data. Another solution is to increase array density and use lower cost drives," he advises.

"Lastly, don't forget ILM (information lifecycle management), which is the concept of moving data from primary high cost storage to lower cost storage as the importance and frequency of access reduces."
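An ILM policy of the kind described above can be sketched as a simple tiering rule keyed on access recency. The tier names and day thresholds here are assumptions for illustration; real ILM policies are tuned to the organisation's data.

```python
# Minimal ILM-style sketch: data ages from high-cost primary storage to
# lower-cost tiers as access frequency drops. Thresholds are assumed.

def pick_tier(days_since_access: int) -> str:
    if days_since_access <= 30:
        return "flash"     # primary, high-cost storage
    if days_since_access <= 180:
        return "nearline"  # mid-cost disk
    return "archive"       # lowest-cost tier (object store or tape)

print(pick_tier(7))    # flash
print(pick_tier(365))  # archive
```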

Strydom concludes by saying that with the continued increase in flash drive densities, he believes 2016 will be the year of all flash for primary storage.

Suzanne Franco
Surveys Editorial Project Manager at ITWeb.
