Cray: Big companies are turning to supercomputers to tackle their big data challenges
Supercomputer maker Cray sees growth in demand from companies trying to tackle the big data deluge
Supercomputer maker Cray says that it is seeing increasing interest in its ultra-high-end machines from businesses looking for ways to tackle their fast-growing big data challenges. And that interest comes from a range of sectors.
Supercomputers have historically been limited to governments and higher education institutions due to their immense cost, size and power requirements.
Cray, one of the biggest supercomputer manufacturers, says that commercial customers accounted for 15 per cent of its revenue of $724.7m in 2015, but it expects this to grow to one-third of all revenue in the near future.
One customer from 2015, Petroleum Geo-Services, installed the fastest supercomputer in the commercial sector. The machine ranked 14th on the Top 500 list of supercomputers, higher than many systems used by government agencies.
Barry Bolding, chief strategy officer at Cray, told Computing that this market growth is due entirely to the big data trend.
"There has been an explosion of data that is really affecting our customers. They are collecting orders of magnitude more data and that's driving a demand for our systems," he said.
Notable sectors where Cray is now seeing such demand are oil and gas, aeronautics, manufacturing and financial services.
But it's not just traditional big business using supercomputers. Cray even has a Major League Baseball team on its books, which uses a supercomputer to analyse player performance and decide who should play in particular games.
Bolding explained that such companies would have used server clusters for their data modelling and simulations in the past, but the amount of data now being collected makes this unfeasible.
"In the commercial space the first customers we had were those who have done simulation modelling before but on cheap, white box cluster servers," he said.
"However, because of the massive amount of data now being collected they are moving away from these cluster servers because they don't have the scale to tackle the problems and data loads now being generated."
Cost concerns
This may be true, but any business leader considering pitching a supercomputer purchase to the board probably has visions of being laughed out of the room: entry-level systems start at $250,000 to $500,000.
However, Bolding claimed that buying clusters of servers from the likes of Dell, IBM and Lenovo to meet the same data demand would end up costing almost as much.
"Supercomputers do have a higher average starting price, but when you get to a certain size or scale you're already into the same sort of price as an entry-level Cray system," he said.
Bolding also dismissed the cloud as a viable way to crunch such huge data loads. "The cloud is great if you're doing bursting and only use it once a month or so, but most people using our systems use them all the time and for volumes of data that you just can't do in the cloud," he explained.
Cray would say all this, of course, as the firm is trying to sell more supercomputers.
Taking a more sober view is Brian Hopkins, an analyst with Forrester, who told Computing that Cray may have a point that some firms will embrace supercomputers for their big data needs, but that the market might not scale as far as the company hopes.
"I think it is still a very narrow slice of the demand focused on the most massive processing needs, like life sciences, healthcare, energy, etc. Our data says that 65 per cent of firms will try to build a data hub or lake by the end of 2016, and we predict that half of those will fail," he said.
"This means that a small fraction will probably recognise that they need more computing horsepower and turn to supercomputers like Cray's."
Furthermore, Hopkins noted that price may still be an issue. "I think Cray's pricing model will be under stress," he said.
IDC analyst Eckhardt Fisher also suggested that, despite Bolding's claims, there is still server cluster kit that could meet the needs of firms in the big data arena, such as HP's Superdome X line.
Superanalytics
However, Cray is not just looking to sell hardware as it pushes more supercomputers into the commercial sector; it is also increasingly working to provide the analytics tools organisations need to meet their big data demands.
Bolding believes this is key to Cray's future. "If we look forward five years I do not believe a supercomputing company can be separate from a superanalytics company. They will have to be the same thing," he said.
"We think this will grow at a very fast pace and our strategy is to position ourselves right in the middle of the analytics and supercomputer markets and provide all the tools this requires."
An example of this is software that Cray developed in conjunction with a security sector partner that can be used to spot and predict cyber threats.
The security company in question has not yet been named, but Bolding confirmed that it will be in the near future.
It remains to be seen whether supercomputers really will become a mainstay of commercial organisations. But the unquestionable increase in data being gathered, and the fact that Cray can point to several major commercial customers, such as oil and gas and aeronautics firms, not to mention a baseball team, suggest that the era of supercomputers as the preserve of governments is coming to an end.