
Getting smart with enterprise storage

Over the past couple of years, enterprise data storage needs and requirements have grown at a tremendous rate, even during periods when the economy was sluggish.
By Kaunda Chama, ITWeb features editor
Johannesburg, 07 Feb 2005

Demand for storage resources continues to grow apace, as does the need for the data they hold to be available to multiple applications, from any location, immediately and continuously. With the emphasis firmly on always-on access, especially through customer relationship management (CRM) and enterprise resource planning (ERP) systems, high-availability storage networks have become a significant growth area.

IT budgets are still under heavy pressure, so the question is not so much how to create a storage infrastructure, but how to create an affordable one. IT decision-makers looking to develop future storage strategies are having to assess their current storage capabilities, identify the gaps, and select the appropriate technologies, vendors and solution providers to fill them.

Cindy Rossouw, Comztek storage expert, says her company sees data recovery as one of the biggest drivers for the growth and improvement in enterprise storage strategies and architectures.

She comments that many companies are taking the write-once-read-many approach to backup and storage, and will for some time continue to write to disk, then back up to slower disk, before going to tape.

"In some cases you find end-users are being over- or under-sold storage. The SMB [small and medium business] market seems oblivious to this, whereas the financial services sector seems better informed about its needs. Basically, sound storage is no longer a nice-to-have, it's a must-have," she says.

Sheldon Hand, senior systems engineer at Veritas, has noted a drive towards systems consolidation. "Companies are fast moving away from having storage silos and are now consolidating their systems, servers and storage. If you always want data at the right place and at the right time, you have to integrate your systems," he says.

The past three years have seen much technology centralisation and consolidation, and storage consolidation has been at the forefront of all this. It has, in turn, been a great enabler for on-demand computing.

Vendors such as Veritas, HP and IBM have been developing new applications aimed at smoothing corporate operations in environments where storage is consolidated.

Says Hand: "Nowadays companies have terabytes of self-generated data which, if properly stored and later mined using tools, can be used to gain competitive advantage in their respective sectors."

The mail trail

E-mail has also been identified as a major driver for increased demand for storage capacity, due to it now being one of the most commonly used forms of correspondence. The fact that e-mail is considered legal documentation in some countries means it has to be kept for longer periods; and this, coupled with richer data formats, contributes to larger file sizes.

The storage of data for business intelligence, CRM and ERP purposes is another major contributor. And although issues around the retention of data have not been finalised in SA, a lot of companies are gearing up for a time when they will need to do more backing up than they have thus far.

Tim Knowles, CEO of Storage Technology Services (StorTech), notes that information lifecycle management is definitely a major contributor to the way corporates are purchasing and managing storage resources. Fanie van Rensburg, MD of Shoden Data Systems, shares this view, noting that business continuity and data availability have become paramount in today's competitive business environment.

"Initially, the post-Y2K period saw firms just adding on to existing storage resources as the need arose, but as needs continued to grow exponentially, companies soon realised that the practice would see them run out of money fast," he says.

Companies have also realised they need to balance their backup and storage technology between expensive and more affordable alternatives.

"It is also important for companies to take a holistic view of information from a senior level, because it is actually a director's fiduciary duty to ensure a company's data is securely stored and that it maintains its integrity," adds Knowles.

However, many companies continue to view storage and backup infrastructure as a grudge purchase because they do not fully understand the rationale behind the legalities.


Meanwhile, management tools are coming of age and now allow users to better manage available storage resources in their corporate environments.

Van Rensburg comments that even with the emergence of cheaper forms of disk, tape will still be around for years to come because, on high volumes, it is still king in terms of keeping costs down. "Currently, tape-to-tape backups are much quicker and more reliable," he says.

Frans Nijeboer, national practice manager for data centre storage solutions at Dimension Data, points out that new technologies are becoming increasingly storage-intensive. He notes that although some companies are putting storage facilities on their networks, a lot are still sitting with information silos.

Human error

According to a Meta Group study, entitled Information Lifecycle Management and Enterprise Content Management: The Confluence of Technology and Business - Through 2004-06, IT organisations will increasingly connect the technical elements of storage management with the business requirements of content management.

The report says enterprise content management (ECM) is emerging as a critical infrastructure requirement, and storage continues to grow in importance as it increases in capacity, functionality and management. The two come together in the nascent yet compelling vision of information lifecycle management (ILM).

For an ILM project to be effective, it must automate the process of managing information. Automation should yield lower risk by minimising the opportunity for human error or interference (eg more accountability) as well as creating higher personnel efficiency, both within and outside the IT group. Automation should also optimise the storage tier and manage the movement, replication and other aspects of storage infrastructure.
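To make the automation argument concrete, the tier-selection step can be sketched as a small rule engine. All names, tiers and thresholds below are illustrative assumptions, not part of any vendor's product:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical tiers, from most to least expensive; names are invented.
TIERS = ["premium-disk", "midrange-disk", "sata-disk", "tape"]

@dataclass
class Item:
    name: str
    last_accessed: date
    business_critical: bool

def target_tier(item: Item, today: date) -> str:
    """Pick a storage tier from explicit, repeatable rules instead of
    ad hoc human judgement, which is what ILM automation calls for."""
    idle = (today - item.last_accessed).days
    if item.business_critical and idle < 30:
        return "premium-disk"
    if idle < 90:
        return "midrange-disk"
    if idle < 365:
        return "sata-disk"
    return "tape"

today = date(2005, 2, 7)
doc = Item("q4-report.doc", today - timedelta(days=120), False)
print(target_tier(doc, today))  # sata-disk
```

Because the rules are documented and deterministic, the same input always yields the same placement, which is precisely where the reduction in human error comes from.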

To implement ILM effectively, the storage infrastructure must have knowledge of the informational content, and vice versa. To illustrate, one should consider that, prior to computer automation, the document owner determined whether the document should be placed in a desk-side file (perhaps with the security of lock and key), in a publicly accessible file, in long-term storage, or whether it should be destroyed. These determinations were made based on the document's content and its importance to the business. Of course, others might have made different judgments, and access was time-consuming and resource-intensive. But computers did not solve these issues; they simply moved them about.


Meta says that from the storage perspective, information was moved from disk to tape to an off-site vault based almost exclusively on a single criterion, the age of the information, without regard to the value of that information to the business. However, current business and regulatory environments require more sophisticated treatment of the information, and new technologies are evolving to meet this need.

Content, not age, is a more valid determinant of data/item disposition. For example, ITOs cannot simply purge e-mail files after a certain period of time. 'Junk' e-mails, of course, can be deleted, but those pertaining to business transactions, human resources, finance and the like might need to be retained, catalogued, stored, accessed and protected for extended (often specific) periods of time.
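A content-driven retention policy of this kind can be sketched in a few lines. The categories and retention periods below are hypothetical examples, not drawn from any specific regulation:

```python
# Illustrative retention schedule keyed on content category, not age alone.
# Categories and periods are invented for the example.
RETENTION_YEARS = {
    "junk": 0,        # may be purged immediately
    "general": 1,
    "hr": 5,
    "finance": 7,     # e.g. transaction records kept for a fixed statutory period
}

def disposition(category: str, age_years: float) -> str:
    """Decide whether an item may be purged, given its content category."""
    keep_for = RETENTION_YEARS.get(category, 7)  # unknown content: retain conservatively
    return "purge" if age_years >= keep_for else "retain"

print(disposition("junk", 0.1))     # purge
print(disposition("finance", 5.0))  # retain
```

Age still matters, but only relative to a period that the content's category sets.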

Gaining access

Moreover, opines the study, it is not as simple as moving the data to off-site storage, because the records must be accessible within a reasonable period of time and cost. The result is the integration of specialised storage systems with content management and e-mail archiving systems (a subset of ECM, with examples such as EMC EmailXtender and IXOS-eCON-server).

ILM: The sum is greater than the parts

Meta trend: Storage management automation, standards and process will remain immature through 2006.
Net annual storage growth will average 35% to 40% for enterprise (monolithic), 45% to 50% for midrange (modular), and 80% to 85% for capacity-based (SATA/ATA). Like-for-like price/capacity will improve 35% a year. Through 2007, storage hardware will be rendered a tiered commodity by software-based information lifecycle functionality (eg storage resource management, data protection/recoverability, integration, data movement and interoperability), which will become the primary enterprise storage differentiators.
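As a rough check on what such rates imply, compound growth can be converted into a doubling time, assuming steady year-on-year growth:

```python
import math

# At 40% net annual growth (the upper enterprise-class figure above),
# capacity under management roughly doubles every two years.
growth = 0.40
years_to_double = math.log(2) / math.log(1 + growth)
print(round(years_to_double, 1))  # ~2.1 years
```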
Meta says the following three phenomena have combined to create the "perfect storm" for information management:
* Ubiquitous and nearly impossible-to-control growth and access to enterprise and World Wide Web information.
* Security, compliance and data integrity, driven by requirements for financial transparency and new regulatory schemes such as the Sarbanes-Oxley Act, HIPAA and the rules of various US and European regulatory agencies.
* Information and disaster recovery heightened by the 11 September 2001 terrorist attacks and August 2003 power-grid blackouts.
Taken together, these matters, once the sole domain of IT, are now of direct and immediate importance to lines of business and the entire organisation.
The research group defines information (or data) lifecycle management (ILM) as follows: "ILM is the process by which information is moved through a continuum of storage media to ensure business-required service-level delivery at the lowest unit cost, based on the content of the data element. ILM also includes progressively maturing and automating storage management processes that result in year-on-year personnel efficiency improvement, all without sacrificing rapid response to changing business requirements."

Meta says, to date, storage has been managed on a two-tier model: disk and tape. However, with rapidly improving storage management technology, IT organisations have a much broader range of infrastructure deployment options that enable them to better match business requirements (data access, retention and security) with infrastructure choices (enterprise storage, midrange storage, ATA/SATA or tape). The group believes the industry is on the cusp of robust heterogeneous storage management capabilities that will enable ITOs to select best-of-breed storage systems and management.

"ILM is not a silver bullet for managing data or storage, but a holistic programme that begins with process refinement and rule definitions then drives toward automation of those processes using repeatable methodologies," says the research firm.

Meta adds that fully implementing ILM is an extensive project, one that could take a year or more to complete. Yet, despite the effort, a properly implemented ILM project has numerous significant benefits. These include greater organisational agility, reduced risk in many areas and an optimised storage cost structure. However, ILM need not be a daunting "big bang" project and can be implemented incrementally, starting with the one or two applications that have significant impact on the organisation, and expanding as the following benefits are realised:

* Organisational agility: ILM facilitates organisational agility primarily by establishing a set of processes that are routinely re-examined and refined. Because these processes are documented and repeatable, they become bigger than any individual person and as much a part of the infrastructure as any hardware or software componentry. By linking these process refinements to the business unit, the chances of a "blindside" event are reduced. Blindside events invariably cost the organisation unduly, because emergency purchases are made at high cost, existing investments are sidelined and current projects are shelved in mid-process.

* Reduced risk: An automated, procedural solution to compliance ensures the organisation will not revert to old habits as the sense of urgency fades. Furthermore, an automated solution ensures continuity and compliance even as personnel inevitably turn over. Finally, an automated solution has checks and balances and can be made tamper-proof from both internal and external attacks while improving the organisation`s legal credibility.

* Lower storage costs: ILM can optimise the deployment of assets, resulting in better asset utilisation, improved access to information and lower cost per unit stored. ILM enables ITOs to deploy storage tiers (like premium storage for critical applications, entry-level storage for routine applications). The different tiers of storage available to an organisation carry distinct and relative cost differences that will have varying net financial impact (due to the differing degrees that a group will leverage each tier). However, moving from a 'one-size-fits-all' approach will carry tangible, measurable, financial benefits.
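The cost argument in the last point can be illustrated with back-of-envelope arithmetic. The per-gigabyte costs and the data mix below are invented for the example:

```python
# Hypothetical monthly cost per GB for each tier, and how the data splits
# across tiers once ILM has classified it. All figures are illustrative.
cost_per_gb = {"premium": 10.0, "midrange": 4.0, "sata": 1.5, "tape": 0.3}
mix_gb      = {"premium": 100, "midrange": 300, "sata": 400, "tape": 1200}

one_size = sum(mix_gb.values()) * cost_per_gb["premium"]        # everything on premium disk
tiered   = sum(mix_gb[t] * cost_per_gb[t] for t in mix_gb)      # matched to tiers

print(one_size, tiered)  # 20000.0 vs ~3160.0 under these assumptions
```

Under these made-up figures the tiered layout costs roughly a sixth of keeping everything on premium disk; the real ratio depends entirely on an organisation's own mix and pricing.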

Essentially, information lifecycle management is a concept that goes well beyond storage media selection. ILM meshes process refinement, data/content management and infrastructure management into a holistic strategy where the sum is greater than the parts. Properly implemented, ILM can yield not only intangibles such as greater organisational efficiency and agility, but also hard monetary savings.

New challenges for storage management

Advances in technology are assisting users to improve data protection - and meet new corporate governance requirements.

Compiled by Kaunda Chama

The concepts of data storage and archiving are influenced by a set of fast-moving technologies, which continuously incorporate new developments and approaches to the challenges associated with the management of corporate data.

This is the view of John Hope-Bailie, technical director of Source Consulting. He says some examples of new technologies to appear are disk-to-disk (D2D) and disk-to-disk-to-tape (D2D2T) backup, as well as 'tiered storage' which forms the basis for the introduction of a new generation of information lifecycle management (ILM) practices and disciplines.

"For today's executives, be they technically-oriented or not, knowledge of these and many other developments is critical, as they will have an impact on the future architectures of their corporate-wide computer systems - and the budgets required to support them," he says.

Without doubt, planning is a key issue. For example, organisations today should be placing more emphasis on disaster prevention instead of disaster recovery, by using technologies such as replicated remote storage and Ethernet-based storage networks.

"Today, questions are being asked about the most effective and legally compliant data storage options. This is also true for larger companies with many branch office locations," he says.

Local scene

Locally, remote storage has become prevalent because of the difficulties associated with moving large amounts of data to a centralised location over slow, often costly, wide area network infrastructures.

However, as legislative requirements have highlighted, data held at remote sites needs to be managed, secured and archived in exactly the same way as data centre information.

Says Hope-Bailie: "This is difficult to achieve at branch office level, where backup tape drives and other equipment are often unsecured and the skills needed to manage effective backup and storage procedures are in short supply."

Fortunately, technologies are becoming available which will improve this situation tremendously. By optimising use of the WAN link, they can overcome bandwidth and cost constraints.

Distributed data strategy

Amit Parbhucharan, technology marketing director at Channel Data, says IT departments have to keep pace with current trends and store, archive and secure data on an enterprise-wide scale. In terms of centrally stored data, most are succeeding. However, the potential disasters are to be found on the client side.

Unlike central enterprise systems - such as CRM and ERP software - client-side data is often unstructured and unmanaged. Moreover, according to research groups, more than 60% of the client-side data residing on PCs is not adequately backed up.

Budgets also remain significant stumbling blocks. Many organisations simply lack the resources to implement a universal client backup policy. But, at odds with the cost-cutting efforts of most companies, is legislation.

Abiding by the law

In the US, the Sarbanes-Oxley Act, crafted to prevent high-profile corporate malfeasance, came in the wake of the Enron and MCI WorldCom corporate accounting scandals.

In SA we have the second King Committee Report (King II) and the JSE Securities Exchange Listing Requirements Guidelines, which propose that company directors, collectively and individually, accept full responsibility for the accuracy of corporate information and reporting.

Furthermore, the Electronic Communications and Transactions Act requires that companies have a clear document management policy, a legal e-mail disclaimer notice, a sound IT security policy and, importantly, a fail-safe storage and backup system.

Until recently, backing up client data has been both complex and unsatisfactory. Desktop PCs on the internal corporate network typically enjoy fast network connections (of the order of 100Mbps), but the aggregate data load from thousands of client PCs can be extreme.

Even best-case broadband links operate at data rates below 1MBps, making them unsuitable for multi-megabyte transfers. The growing ranks of mobile PCs pose a threat to any effective, client-side backup routine.
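A back-of-envelope calculation shows why full client backups strain even fast internal links. All figures below are hypothetical:

```python
# Hypothetical fleet: why full-image client backups do not scale, even on a LAN.
clients = 2000
full_backup_mb = 500            # assumed data set per PC
lan_mb_per_s = 100 / 8          # 100Mbps LAN is roughly 12.5 MB/s

aggregate_gb = clients * full_backup_mb / 1024
hours_serialised = clients * full_backup_mb / lan_mb_per_s / 3600
print(round(aggregate_gb), round(hours_serialised, 1))  # 977 GB, ~22.2 hours
```

Nearly a terabyte per cycle, and almost a full day if the transfers were serialised on one link; over a sub-1MBps broadband connection the same job is simply infeasible, which is why incremental approaches matter.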

So, ironically, those mobile PC users whose data is most at risk of loss are the ones least likely to back up.

The answer?

One trend in the US is the implementation of intelligent, scalable and automated PC backup, archiving and consolidation solutions.

In a typical scenario, "smart" agent software on the client analyses and compares disk contents with data already backed up on the central servers. This process performs low-level comparisons of data at the client and on the server, and uploads only those individual blocks that have changed in each file since the last backup.
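The block-comparison step can be sketched with ordinary hashing. This is a generic illustration of the technique, not Connected Corporation's actual implementation; the block size is an arbitrary assumption:

```python
import hashlib

BLOCK = 4096  # illustrative block size in bytes

def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block of the data."""
    return [hashlib.sha1(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def changed_blocks(old: bytes, new: bytes) -> list[tuple[int, bytes]]:
    """Return (index, block) pairs to upload: blocks whose hash differs
    from the previous backup, plus any blocks beyond the old length."""
    old_h = block_hashes(old)
    out = []
    for i, h in enumerate(block_hashes(new)):
        if i >= len(old_h) or old_h[i] != h:
            out.append((i, new[i * BLOCK:(i + 1) * BLOCK]))
    return out

# Only the one modified block travels over the wire.
old = b"a" * 8192
new = b"a" * 4096 + b"b" * 4096
print([i for i, _ in changed_blocks(old, new)])  # [1]
```

In practice the client keeps only the hashes of the last backup locally, so the comparison costs a few bytes per block rather than a second copy of the data.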

Today, thanks to ongoing advances in technology from independent software vendors such as Connected Corporation, capabilities such as disk scanning, content analysis, encryption and compression have been made available to assist users to improve data protection - and help meet corporate governance requirements.

* Article first published on brainstorm.itweb.co.za
