
A new urgency?

A storage singularity? Maybe as early as this year.

By Paul Furber, ITWeb contributor
Johannesburg, 15 Feb 2010

Storage issues haven't changed a great deal in the last 10 years. To be sure, disks have got faster and more intelligent and the price per megabyte has plummeted, but precious few organisations allocate their storage correctly, manage it properly or upgrade appropriately.

As disks get cheaper, why not throw them at a temporary storage niggle? But that decision almost always turns out to be a bad one in the long-term. There's nothing so permanent in the IT world as a temporary fix, and playing "extend and pretend" can only last so long.

Will 2010 be the year when companies get a grip? Possibly. The economy is one goad. Tough times have a way of focusing minds: a better approach to storage is a good way of saving money. The problem is that we all generate data in bad times as well as good.

"Technical staff at our customers have said to us that they've got a real challenge on their hands," says Sheldon Hand, country manager of Symantec. "The business has come to them and said that revenue is down 30%. Please can you explain why we need more money for storage? It's only during these times that people are actually forced to look at what they've spent and their utilisation rates."

In the past, when enterprise disk was expensive and hiding too many MP3s on the company server really did waste money, information life cycle management (ILM) was the mantra: an ILM system would make sure the different tiers of storage were used appropriately. Dave Funnell, storage principal at XIV Storage Systems, IBM, says, though, that there was a fundamental flaw with ILM.

"ILM was forced on customers by telling them they could save money by adding another tier of disk. Unfortunately, in practice what happened was the first place that data was assigned turned out to be the place it sat, whether it was an appropriate tier or not. Is it appropriate for every organisation today? No, sometimes it really is easier to buy more disk. But eventually we will have environments where the data resides on an appropriately performing disk without any intervention from the administrators at all. And they can concentrate on delivering what the business cares about, which is 'can I recover?', 'can I implement disaster recovery?' - not 'do I have 15 000rpm disks or not?'"

That remains a dream for now. "I'd like to know how many people have seen true ILM projects working," comments John Hope-Bailie, technical director at Demand Data. "As in projects where corporate-defined rules are shuffling unstructured data around and it's all working perfectly?"

But tiering may be making a comeback in a slightly different form, says Bertus van Heerden, GM of Business Connexion.

"Tiering and ILM were popular a few years ago when storage was quite expensive. It's become a lot cheaper. Auto-tiering is definitely where things are going and it is available in the enterprise space today.

"The one place we've seen archiving is in the backup environment. Once your data is archived properly, then it's backed up. We should take away the good ideas from ILM, but make sure that there's ease of management. The problem with throwing storage at a problem is that the terabyte or whatever you want to throw at it must be manageable."


Mohammed Cassoojee, director of communications, media and utilities at Oracle SA, says Oracle has done a lot of work in this area.

"We've automated the process of information management based on policy so that information gets archived and tiered, depending on how important it is. We've also been working on compression so that data is compressed between four and 17 times."

Consulting the archives

The most important long-term storage principle - that data always outlasts hardware - is either unknown or poorly understood at many organisations. The lack of proper archiving, according to Commvault's business development director Brian Balfe, stems from a lack of proper infrastructure planning.

"People are struggling with where they need to go with it. It's become such a mundane tickbox item for IT management that it's fallen off the business radar. It comes back on the radar twice each year or so when something is lost and needs to be retrieved, but there's no real review about what is actually needed. There needs to be a decision about what is really important. Maybe that box of tapes covered in pigeon poo and layers of dust isn't as good as we thought it was. That conversation then moves into a data management discussion. But the media issue is never going to go away. The fundamental fact is that the data always needs to live longer than the hardware, but the infrastructure planning session never runs that way - and that's why we find ourselves in three years' time paying good money for specialists to come and do low-level reads on very tattered disks or tapes."

Data archival is full of the usual horror stories. IBM's Funnell says a large broadcaster in the UK did a time capsule exercise where it digitised some information onto a bunch of platters.

"Fast forward 10 years and when they took the platters out of the ground, they didn't have the ability to read the platters. The fact that the media itself might have been in good condition didn't matter - it was the ability to read it that mattered."

The other common mistake is failing to distinguish between backup and archival solutions.

Says Petrus Human, technical director at Attix5: "You do need to make a distinction between backups and archiving and the methods of doing both. Backups can be kept online, but archives shouldn't. Backup media needs to be rotated; archive media needs to be long-lasting. Encryption of data is important but when it's archived, can you yourself retrieve the key in 10 years' time?"

For how long should data last? It depends on how critical it is, says Louis Botha, CTO of ThinkIT.

"For a small business it's easy to keep data for only three to four years. The financial industry needs to keep data much longer and the medical industry needs to keep X-rays for 40 to 50 years. Government and some insurance companies need to keep it even longer than that."


Many companies are not in the financial, medical or insurance industries, but that doesn't mean archiving won't soon be important to them, notes Balfe.

"There are two pieces of legislation that will have a huge impact on the South African landscape soon: the revision of the ECT Act and the Protection of Personal Information Bill. What effect will the new Bill have on e-discovery? Will storage and archiving solutions be appropriate for it? At a certain level in any organisation there are people who are very clear about what needs to be done. But they are almost never involved in the detailed implementation and I think as an industry, we don't take that seriously enough."

But can it be read?

The core of the Protection of Personal Information Bill is, fairly obviously, personal information and how it's collected, stored and retrieved. The Bill, which was published in August last year, has a rather sweeping scope: it applies to anyone - private individual, SME or large enterprise - who handles someone else's personal information. But, quite apart from the legal implications of complying, the Bill is also forcing local companies to ask all sorts of questions about data formats. June Julyan, ASB-Mobius product manager at Bateleur Software, notes that managing data is really a software issue.

"A lot of the discussions about data should be identifying what data needs to be archived, what tools should be used and what format it's in."

Symantec's Hand agrees. "I think the format in which data is stored is as much an issue as the hardware used to read it and that's a software problem. It's a problem for the home user or the SME who wants to recover data, but it's in a proprietary format. It's something that needs to be considered. We talk about life cycle policies and storage tiers, but the format in which the data is stored is just as important."

Says Hope-Bailie: "The use of meta-data around the data is essential. You need to be able to know what data is important to your company. The intelligence you build into it is absolutely critical. It's no good just being able to store data economically if you can't find what you want when you need to."

And what of the cloud?

"The cloud is attractive as a concept," he says. "You throw your data at a vendor and he's got to make it readable and accessible for however long you choose. But the cloud is not without its problems. There have been some cloud failures. And if a director makes a decision to go with a cloud vendor and the vendor goes belly-up, does that director spend time in jail because of King III? What are the consequences? I don't think you can get rid of the risk by putting it on someone else's shoulders."

Cloud storage is starting to be disruptive in First World markets, particularly as it becomes easier to connect public cloud storage as if it were part of a company's internal infrastructure, using a pay-as-you-go model. That is putting definite pressure on IT departments to restructure their storage to be more cloud-friendly and to make their own services price-competitive. But vendors also have to move with the times.


Chris Bamber, MD of Sysdba, says that because of cloud, storage vendors have to choose how they sell their wares.

"Some vendors are punting more utilisation, others are punting more efficiency. It's become less about selling disks and much more about services and solutions."

And that is no bad thing.
