Tuesday, 18 September 2012 08:44

Why data centres are becoming more complex

By Graeme Philipson

Symantec has released the findings of a major new survey showing that data centres are growing more complex, mainly because of the demands of mobile computing, increasing virtualisation and the move to cloud computing.

Its 2012 State of the Data Centre Survey highlights the underlying drivers of data centre complexity, the current impact on the business, and the latest initiatives IT is adopting to mitigate the issues.

The survey found that 79 percent of organisations report increasing complexity in the data centre. The trend stems from a variety of factors, and many organisations find they must implement an information governance strategy to address data centre growing pains.

The State of the Data Centre findings emphasise the importance of intelligently managing organisational resources to rein in operational costs and control information growth. The survey was conducted by ReRez Research in March 2012, and its findings are based on responses from 2,453 IT professionals at organisations in 34 countries. Respondents included senior IT staff focused on operations and tactical functions, as well as staff focused on planning and IT management.

Some 44 percent of organisations cite mobile computing as a top driver of data centre complexity. "As today's businesses generate more information and introduce new technologies into the data centre, these changes can either act as a sail to catch the wind and accelerate growth, or an anchor holding organisations back," said Symantec's Brian Dye.

"The difference is up to organisations, which can meet the challenges head on by implementing controls such as standardisation or establishing an information governance strategy to keep information from becoming a liability."

Organisations of all sizes, industries and regions report increasing complexity within the data centre. According to the survey, data centre complexity impacts all areas of computing, most notably security and infrastructure, as well as disaster recovery, storage and compliance.

Respondents rated complexity across all areas fairly evenly (6.6 or higher out of 10), with security topping the list at 7.1. The average level of complexity for companies around the world was 6.7. Organisations in the Americas rated complexity highest, at 7.8, while those in Asia-Pacific rated it lowest, at 6.2.

The report says several factors are driving data centre complexity. First, respondents reported they are dealing with a growing number of applications they consider business-critical: nearly two-thirds (65 percent) said the number of business-critical applications is increasing or increasing greatly. Other key drivers include the growth of strategic IT trends such as mobile computing (cited by 44 percent of respondents), server virtualisation (43 percent) and public cloud (41 percent).

The survey revealed that the effects of growing data centre complexity are far-reaching. The most commonly mentioned impact is higher costs, with nearly half of organisations citing it as a consequence of complexity. Other impacts include reduced agility (39 percent), longer lead times for storage migration (39 percent) and provisioning storage (38 percent), security breaches (35 percent), and downtime (35 percent).

The typical organisation experienced an average of 16 data centre outages in the past 12 months, at a total cost of $5.1 million, or roughly $320,000 per outage. The most common cause was system failure, followed by human error and natural disasters.

According to the survey, organisations are implementing several measures to reduce complexity, including training, standardisation, centralisation, virtualisation and increased budgets. In fact, 63 percent of respondents consider increasing their budget somewhat or extremely important in dealing with data centre complexity.

But the single biggest initiative organisations are undertaking is implementing a comprehensive information governance strategy: a formal program that allows organisations to proactively classify, retain and discover information in order to reduce information risk, cut the cost of managing information, establish retention policies and streamline the eDiscovery process. Ninety percent of organisations are either discussing information governance or have implemented trials or actual programs.

The biggest drivers for information governance include security (rated somewhat or extremely important by 75 percent of respondents), the availability of new technologies that make information governance easier (69 percent), increased data centre complexity (65 percent), data growth (65 percent), and regulatory and legal issues (61 and 56 percent, respectively).

Organisations have several goals with information governance, including enhanced security (considered important by 75 percent), ease of finding the right information in a timely manner (70 percent), reduced costs of information management (69 percent) and storage (68 percent), reduced legal and compliance risks (65 and 64 percent, respectively), and a move to the cloud (59 percent).


Graeme Philipson

Graeme Philipson is senior associate editor at iTWire. He is one of Australia's longest serving and most experienced IT journalists, and the author of the only definitive history of the Australian IT industry, 'A Vision Splendid: The History of Australian Computing'.

He has been in the high tech industry for more than 30 years, most of that time as a market researcher, analyst and journalist. He was founding editor of MIS magazine, and is a former editor of Computerworld Australia. He was a research director for Gartner Asia Pacific and research manager for the Yankee Group Australia. He was a long time weekly IT columnist in The Age and The Sydney Morning Herald, and is a recipient of the Kester Award for lifetime achievement in IT journalism.
