Data strategy key to improved service delivery

The downturn has forced organisations to rethink their data centre purchasing decisions

Only two years ago, certain trends in data centres seemed inevitable. Many organisations were using virtualisation to improve the efficiency of their servers and, in the long term, it looked as if a mass move towards cloud computing was only a matter of time: it was cheaper, more energy-efficient and allowed organisations to scale easily.

Since then, however, we’ve been through a recession and organisations’ priorities have changed. A new report from analyst firm Gartner, Eight Critical Forces that Impact Your Data Strategy, says that between now and 2017, organisations need to consider the potential impact of eight forces, which include processor and system design, architectural topology and capacity growth.

Failure to do so, it says, means that “IT organisations will struggle to introduce new technologies and improve the quality of service delivery”.

Change is being driven by several factors, including the availability of new technologies, environmental pressures and economic restrictions.

The downturn has forced organisations to rethink their data centre purchasing decisions, says Rakesh Kumar, a Gartner vice-president and author of the report, pointing to a 16 percent drop in new server sales by volume in 2009 compared with 2008: “People have spent a lot less on hardware and have had to make do with older kit. That has had an impact on capacity planning, as they can plan for a slightly longer cycle. Although older hardware uses more energy, it’s often better to do that from a cost perspective than to incur the capital cost of new hardware.”
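The arithmetic behind that trade-off is simple to sketch. The figures below are assumptions invented for illustration, not Gartner data, but they show why an older, hungrier server can still win on cost:

```python
# Back-of-the-envelope sketch of the "sweat the old kit" trade-off.
# All figures are assumptions for illustration, not Gartner numbers.
OLD_SERVER_WATTS = 450      # assumed draw of an ageing server
NEW_SERVER_WATTS = 300      # assumed draw of a modern replacement
PRICE_PER_KWH = 0.12        # assumed electricity price, in pounds
NEW_SERVER_CAPEX = 3_000    # assumed purchase price of the replacement

hours_per_year = 24 * 365
# Extra energy the old server burns each year, priced in pounds
energy_premium = (OLD_SERVER_WATTS - NEW_SERVER_WATTS) / 1000 * hours_per_year * PRICE_PER_KWH
payback_years = NEW_SERVER_CAPEX / energy_premium

print(f"Extra energy cost of keeping the old server: £{energy_premium:,.2f}/year")
print(f"Years for new hardware to pay for itself: {payback_years:.1f}")
```

With these assumed figures the energy premium is roughly £158 a year, so the replacement would take about 19 years to pay for itself, which is the logic Kumar describes.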

At the same time, there has been a dramatic shift in attitude towards hosted data centres, Kumar argues. Four or five years ago, a lack of floor space drove several city-based banks to use hosted services, but what seemed a short-term decision at the time now looks like becoming a long-term one.

According to Kumar, many organisations now question the need to invest tens of millions of pounds in their own data centres when third parties can do it at a low cost while offering strong security. At the same time, the line between a hosted model and a cloud model is blurring, with some hosts offering added services.

Factors governing decisions about which model to adopt can be complex, says Kumar: “The majority of our clients do not have a clear strategy – they are confused about the technologies and they are confused about the options.” A hosted model isn’t always preferable, for example: “There are many clients who have done a 10-year or 15-year cashflow model and realised that it’s cheaper for them to build.”
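A minimal sketch of the kind of cashflow model Kumar describes is below. Every figure is invented for illustration (a heavy up-front build cost against a recurring hosting fee), and real models would add staffing, energy and refresh cycles:

```python
# A minimal sketch of a long-horizon build-versus-host cashflow comparison.
# Every figure here is an invented assumption, not a figure from the article.
def npv(cashflows, discount_rate):
    """Net present value of a list of annual costs, year 0 first."""
    return sum(cf / (1 + discount_rate) ** year for year, cf in enumerate(cashflows))

YEARS = 15
DISCOUNT = 0.05                               # assumed cost of capital
build = [20_000_000] + [1_500_000] * YEARS    # up-front capital, then running costs
host = [0] + [3_600_000] * YEARS              # annual fee to a third-party host

print(f"Build NPV over {YEARS} years: £{npv(build, DISCOUNT):,.0f}")
print(f"Host NPV over {YEARS} years:  £{npv(host, DISCOUNT):,.0f}")
```

With these inputs, building (about £35.6m) comes out cheaper than hosting (about £37.4m), but the conclusion flips entirely with the assumptions, which is precisely why clients run the model over 10 or 15 years before deciding.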

Organisations should make decisions based on an assessment of multiple considerations, including cost, efficiency and carbon disclosure regulations, he adds.

Kumar predicts that organisations will increasingly adopt hybrid models, keeping some applications in their own data centres, using a third-party host for others, and putting others in the cloud.

Richard Godfrey, CEO of iPrinciples, which specialises in rich application development and uses Microsoft’s Azure cloud, agrees that a hybrid model is increasingly attractive: “There are certain cases where if you have sensitive information, you might want to keep it under your control, and other things where it doesn’t make economic sense to do it yourself, so it lives in the cloud.

“My view is that there are some things you can have on-premise, and some things you can have in the cloud, and they should be able to inter-operate. You should be able to write your applications just once and then point them at the environment you want to run them against.”
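In code, the “write once, point at any environment” idea Godfrey describes usually comes down to programming against an interface and choosing the backend through configuration. The sketch below is a generic illustration, not iPrinciples’ or Azure’s actual API; the class and variable names are invented, and the cloud backend is stubbed to keep it self-contained:

```python
import os
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Storage interface the application codes against, whatever the environment."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class LocalBlobStore(BlobStore):
    """On-premise backend: keeps blobs on the local filesystem."""
    def __init__(self, root: str = "./blobs"):
        self.root = root
        os.makedirs(root, exist_ok=True)
    def put(self, key: str, data: bytes) -> None:
        with open(os.path.join(self.root, key), "wb") as f:
            f.write(data)
    def get(self, key: str) -> bytes:
        with open(os.path.join(self.root, key), "rb") as f:
            return f.read()

class CloudBlobStore(BlobStore):
    """Cloud backend: in a real deployment this would wrap a provider SDK,
    such as Azure blob storage; stubbed here to keep the sketch runnable."""
    def __init__(self):
        self._blobs = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

def make_store() -> BlobStore:
    # The deployment target is chosen by configuration, not by code changes.
    target = os.environ.get("DEPLOY_TARGET", "onpremise")
    return CloudBlobStore() if target == "cloud" else LocalBlobStore()

store = make_store()
store.put("invoice-42", b"sample payload")   # identical application code either way
print(store.get("invoice-42"))
```

Setting the hypothetical DEPLOY_TARGET variable to “cloud” moves the data without touching the application logic, which is the inter-operation Godfrey is arguing for.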

The cloud model will become more popular, he argues, once cloud providers rethink their pricing model to allow users to pay according to CPU resources used, rather than by a flat fee for having an application in the cloud.
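To see why metering matters, compare an assumed flat monthly fee against an assumed per-CPU-hour rate; both figures below are invented purely for illustration:

```python
# Hypothetical figures only: a flat monthly hosting fee versus metered
# pricing that bills for CPU-hours actually consumed.
FLAT_FEE_PER_MONTH = 400.0   # assumed flat charge for keeping the app in the cloud
PRICE_PER_CPU_HOUR = 0.12    # assumed metered rate

def monthly_cost_metered(cpu_hours_used: float) -> float:
    return cpu_hours_used * PRICE_PER_CPU_HOUR

for cpu_hours in (500, 2000, 5000):
    metered = monthly_cost_metered(cpu_hours)
    cheaper = "metered" if metered < FLAT_FEE_PER_MONTH else "flat fee"
    print(f"{cpu_hours:>5} CPU-hours: metered £{metered:,.2f} "
          f"vs flat £{FLAT_FEE_PER_MONTH:,.2f} -> {cheaper} wins")
```

Under these assumptions the break-even point sits at roughly 3,300 CPU-hours a month: lightly used applications are far cheaper on metered pricing, which is why Godfrey expects usage-based billing to draw more organisations into the cloud.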