IT must be viewed as the "new innovator" in dynamic data centres. IT staff must move from merely maintaining technology to delivering business innovation that helps their company succeed.
For example, they must research, evaluate and acquire reliable, easy-to-implement storage solutions that genuinely improve the speed at which they deliver business services. Businesses today operate at a pace that can place incredible and sometimes unexpected storage demands on the dynamic data centre. Ways must be found to blend various capabilities with ever-evolving architectures, such as cloud storage, to deliver greater flexibility and responsiveness to changing business demands.
Big data is an increasingly big deal. As the world continues to generate ever-spiralling quantities of information, the challenge faced by businesses is how to take advantage of this new resource. Doing so depends not only on massively scalable storage infrastructure; it also depends on the kind of flexible and dependable computing platforms that only the mainframe can consistently deliver.
Worldwide, IT leaders who responded to CA's research indicated that 41% will maintain their hardware budget, while 23% expect a drop and 36% will spend more. The picture for services is similar: 39% will keep spending flat, 18% will lose some resources and 44% will spend more.
In context, this broad-based increase comes against a generally bleak growth market for IT outlays: research firm Gartner's Global IT Spending Forecast projected a 3% rise for 2012 over 2011, which in turn saw a 7.9% rise over 2010.

Commenting on CA's research, Jon Toigo, managing principal of Toigo Partners International in Tampa Bay, Florida, says: "We can assume that the staggering growth of real-time data and the need for massive processing power to collate and analyse all that information surely has a lot to do with this view.
"E-commerce alone is generating huge volumes of data, and larger corporations in particular need a blend of power, scalability and security to manage the incoming information. They need the best tools, and a mix of good technologies, including the mainframe, to stay competitive.
Here's a sense of just how big "big data" is. Research firm IDC predicts the volume of digital content alone will grow to 2.7 zettabytes this year, up 48% from 2011. By way of reference, the first one terabyte commercial system didn't hit the market until 2010. A zettabyte is equal to one billion terabytes.
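To put those figures in perspective, a quick back-of-the-envelope calculation (a sketch in Python, assuming the usual decimal SI units for storage, where 1 TB = 10^12 bytes and 1 ZB = 10^21 bytes) confirms the conversion and the implied 2011 volume:

```python
# Back-of-the-envelope check on IDC's digital-content figures.
# Assumes decimal (SI) storage units: 1 TB = 10**12 bytes, 1 ZB = 10**21 bytes.
TB = 10**12
ZB = 10**21

terabytes_per_zettabyte = ZB // TB
print(terabytes_per_zettabyte)  # 1000000000 -> one billion terabytes

# A 2.7 ZB forecast for 2012, up 48% on 2011, implies the 2011 volume was:
volume_2012_zb = 2.7
volume_2011_zb = volume_2012_zb / 1.48
print(round(volume_2011_zb, 2))  # roughly 1.82 ZB
```

The exact byte counts are not in the article; the sketch simply shows the unit arithmetic behind the "one billion terabytes" statement.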
IT departments are further challenged by the fact that this data is presented in a dizzying variety of form factors, operating systems and applications, and an even broader range of data types. The vast majority of this information is unstructured, from videos and MP3 files to social media posts.
The data is surely rich in value, but it's hard to organise and analyse.
How does the Big Iron mainframe solve the problem? In one customer project discussed at a recent conference, a mainframe was used to deploy 16 images into a virtual environment. It took one hour; a distributed environment required three days for the same task.
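The scale of that difference is easy to quantify. A minimal sketch (assuming the article's round figures of one hour versus three days for the same 16-image deployment):

```python
# Speedup implied by the customer example: deploying 16 virtual images
# took 1 hour on the mainframe vs. 3 days in a distributed environment.
mainframe_hours = 1
distributed_hours = 3 * 24  # three days expressed in hours

speedup = distributed_hours / mainframe_hours
print(speedup)  # 72.0 -> the mainframe finished the task 72x faster
```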
This isn't the only instance of new needs being met by old capabilities. Because of its continuing viability in the enterprise, the mainframe has become intrinsic to innovation.
For example, as Mainframe as a Mainstay of the Enterprise 2012 reveals, the largest form factor is keeping pace with the smallest. More than 36% of US companies, and 44% globally, have either already sanctioned mobile management of the mainframe or will do so in the next six months, while another 30%, both in the United States and worldwide, predict the same in six to 18 months.
Of course, there are still holdouts: 13% of US respondents and 9% internationally say they will "never" enable mobile mainframe management.
In the final Industry Insight in this series, I will look at the human factor and the future.