Reduced to Tiers: what uptime standards don't reveal about a data centre
Don't mix up Tiers and Levels when it comes to assessing a new data centre partner
The official Tier classification of a data centre tells you everything you need to know about the reliability of a facility, right?
Wrong. The Tier rating only guarantees a specified standard of the physical environment - the facility itself - not the data processing capability or the IT systems that reside within it. The Tier category is certainly an essential piece of the reliability jigsaw, but it doesn't give you the complete picture.
Tier classification, as awarded by data centre research organisation the Uptime Institute, reveals nothing about data security at the facility, the quality of operational management, the risks of fire or flood - anything, in fact, other than computing availability.
Tier classification is an indispensable tool for assessing data centre suitability in terms of downtime because it makes reliable promises about the impact of maintenance, unplanned outages, and so on. But that's as far as it goes.
To view even the highest category - Tier IV - as shorthand for overall data centre quality is a mistake that could cost a business its life.
Tiers and levels: what's the difference?
It was the Telecommunications Industry Association (TIA) that first defined four levels of data centre, from the simple Level 1 facility through to the full Level 4 set-up. But the Uptime Institute put some robust metrics behind the categories, creating the dependable Tier Standard.
The higher the Tier, the greater the availability. A Tier I data centre is little more than a server room with no redundant - or back-up - components, but a Tier IV facility has duplicates of all essential equipment, along with other features that enable mission-critical systems to operate uninterrupted through equipment failures and primary power outages.
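To put "greater availability" in concrete terms, here is a minimal sketch in Python that converts an availability percentage into expected annual downtime. The percentages used are the figures most commonly quoted for the four Tiers; they are included purely for illustration and are assumptions on our part, not numbers drawn from this article or any contractual guarantee.

```python
# Convert an availability percentage into expected annual downtime.
# The percentages below are the figures commonly quoted for the four
# Tiers - treat them as illustrative assumptions, not a guarantee.

MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes

commonly_quoted_availability = {
    "Tier I": 99.671,
    "Tier II": 99.741,
    "Tier III": 99.982,
    "Tier IV": 99.995,
}

for tier, pct in commonly_quoted_availability.items():
    downtime = MINUTES_PER_YEAR * (1 - pct / 100)
    print(f"{tier}: {pct}% availability = about {downtime:,.0f} minutes of downtime a year")
```

On those commonly quoted figures, the gap between the bottom and top of the scale is roughly 28.8 hours versus 26 minutes of downtime a year.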
An official Uptime Tier classification offers a guarantee of data processing availability, as specified by the number. There is no such guarantee behind the term "level".
It's an important distinction to keep in mind, especially if a potential supplier uses the two terms interchangeably. Be sceptical: ask for proof of the Tier claimed.
Demand more than a Tier grade
Good suppliers should have additional accreditations and certificates offering assurances about all aspects of the data centre operation. Ask to see them and make sure they are up to date.
Due diligence is the way forward: preferably a full, independent health-check of the facility. This may be the only way to make sure that even a fully redundant set-up doesn't let you down.
There are many ways a data centre operation can be compromised:
- Cooling
Can the cooling system cope with the amount of electronic equipment in the data centre? (See the rough check sketched after this list.)
- Fire suppression
Is the data centre vulnerable to fire? Uptime Tier Standards don't say anything about the need for a fire suppression system.
- Smoke and leak detection
Both can wreak havoc with electronics. Does the data centre have early warning systems to detect, for example, a leak in the air conditioning?
- Flood
Is the location of the data centre a flood risk? Has anyone conducted a formal study?
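On the cooling point above, here is a hypothetical back-of-envelope check in Python. It rests on two widely accepted facts - virtually all the power IT equipment draws is emitted as heat, and capacity should still cover the load with one cooling unit out of service (the usual N+1 test) - but the rack and unit figures are invented for illustration and are no substitute for a professional assessment.

```python
# Back-of-envelope cooling check with illustrative, invented figures.
# IT power draw (kW) is treated as heat output (kW), since nearly all
# electrical power consumed by IT equipment is dissipated as heat.

rack_loads_kw = [12.0, 12.0, 8.5, 6.0]    # assumed power draw per rack
cooling_units_kw = [30.0, 30.0, 30.0]     # assumed capacity per cooling unit

it_heat_load = sum(rack_loads_kw)
# N+1 test: capacity must cover the load even with the largest unit failed.
capacity_one_down = sum(cooling_units_kw) - max(cooling_units_kw)

print(f"IT heat load: {it_heat_load} kW")
print(f"Cooling capacity with one unit down: {capacity_one_down} kW")
print("N+1 check:", "PASS" if it_heat_load <= capacity_one_down else "FAIL")
```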
All these risks should be considered and mitigated at the design stage, especially since many such systems are extremely difficult to retrofit in a data centre. But since omitting some big capital items does not affect Uptime accreditation, a supplier may have decided not to incur the extra costs, leaving your data processing activity exposed to risk.
Take a close look at how the operation is managed
A data centre boasting the best technical infrastructure in the world can be compromised by poor management.
There are plenty of specialists out there who know exactly what to look for to evaluate operational quality, but here are a few things to consider:
- Is there a complete record - detailed access logs - of who is going in and out, when, and for what reason?
- Is CCTV coverage comprehensive, taking in every sight line?
- Are doors regularly held open to let people - and dust - into the centre?
Management should have meticulous operational knowledge. The best, most secure data centres are dark places that hardly anyone visits - they just sit there, unlit, doing their job.
Unsurprisingly perhaps, Microsoft provides the object lesson. Not only are all racks locked and covered by CCTV so management can verify what any member of staff is doing at any time, but the centres are also designed to minimise the need for people to be in the same rooms as the servers. The positioning of external communications equipment ensures third-party engineers never encounter the servers.
In the public sector, however, spending constraints and legacy issues mean it is often a different story. Inappropriate buildings may have been pressed into service as data centres, or servers may be housed in rooms also used for other purposes. These centres may have Tier III or IV accreditation for data processing availability - but the operation itself may put critical systems at risk.
The more people who have access to a data centre, the greater the risks. Cleaners have, on occasion, unplugged critical equipment because the only socket they could find for the vacuum was at the back of a rack.
Getting the data centre you want
Whether you are planning to deploy critical equipment into a data centre hosted by a third-party supplier, or intend to set up an internal data centre, the best practice rules are the same:
- Don't confuse the Tier standard with overall quality
- Don't take supplier claims (especially of their data centre's "level") at face value
- Do check and regularly review certification
- Do get an independent report on the quality of the operation.
There is no substitute for due diligence - and nobody wants to learn that the hard way.
David Cohen is a Principal Consultant at Mason Advisory