Data-as-a-service? How Delphix's approach to DevOps helped it beat big-name rivals at Computing's Vendor Excellence Awards
Key points
Data is one of the key challenges facing DevOps teams, so any tool that promises to solve that problem is bound to stand out
When Delphix won the DevOps Solution Award at Computing's Vendor Excellence Awards in July, it didn't come as a surprise to Jes Breslaw, EMEA director of marketing and strategy at Delphix.
While there are plenty of good solutions that can aid organisations' shift to a DevOps style of working, there is a gap when it comes to streamlining access to the corporate databases that all applications rely on.
"Why is data such an issue? It's an issue in a few ways. From a time point of view, getting new data into a development or test environment is a slow and manual process taking operations days, weeks or even months. With sizes of databases exploding into hundreds of terabytes, even petabytes as with some of our customers, it slows down the software development life cycle," says Breslaw.
Then, as you move through the development life cycle, data needs to be periodically refreshed, and test data needs to be reset or rewound. This can add weeks or months to software development, resulting in project schedules that slip.
In response, organisations are forced to enact workarounds with data sub-sets, or by using synthetic data. "This affects the quality because you're not working with a true version of your data. As bugs stack up, they take more time to fix, adding yet more time to the project. Then, there's the whole idea of self-service and automation.
"One of the main themes of DevOps is to take some of the more mundane operational tasks, such as data refresh, reset or rewind, and to automate them or provide them to the developers via a self-service tool. Delphix, again, can help. Copying data from one environment to another is a tiresome task - database administrators spend 90 per cent of their time doing that. Delphix can eliminate that process," says Breslaw.
By implementing Delphix's data-as-a-service platform, application developers and testers are able to use near-live copies of production data, without requiring large-scale support, and without the storage implications from duplicating data multiple times.
"Now, whether you want to automate via Delphix, or integrate with DevOps tools such as Chef, Puppet, Jenkins and so on, or you want to introduce self-service so that your development team can spin up environments on the fly, they can do it themselves.
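To make the self-service idea concrete, a CI job (a Jenkins pipeline step, say) might request a fresh virtual environment over an HTTP API before its tests run. This is a minimal sketch only: the host, endpoint and payload fields below are hypothetical placeholders, not Delphix's actual API.

```python
# Hypothetical sketch of self-service provisioning from a CI job.
# The host, endpoint and payload fields are illustrative placeholders,
# not Delphix's actual REST API.
import json
import urllib.request


def build_provision_request(source_db: str, env_name: str) -> dict:
    """Describe the virtual environment a developer wants spun up."""
    return {"source": source_db, "name": env_name, "refresh": "latest"}


def provision_virtual_db(api_host: str, source_db: str, env_name: str) -> dict:
    """POST the request and return the new environment's connection details."""
    body = json.dumps(build_provision_request(source_db, env_name)).encode()
    req = urllib.request.Request(
        f"https://{api_host}/api/environments",  # placeholder endpoint
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# A test stage could then call, for example:
# provision_virtual_db("data.example.com", "orders_prod", "orders-ci-42")
```

The point is that provisioning becomes a single scripted call a developer or pipeline can make on demand, rather than a ticket to the operations team.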
"The benefits are huge. With data-as-a-service, project times are typically halved. Data-related errors are eliminated and as much as 90 per cent of storage costs are removed. But perhaps the bigger impact of Delphix is that developers really don't have to worry about infrastructure any more.
"They haven't got to worry about how much storage costs. They don't have to worry about getting their infrastructure team to set up an environment or worry about the impact on other projects. They can simply get on with doing what they do best, creating applications," says Breslaw.
Delphix's data-as-a-service software sits as a layer between storage and applications. "Instead of copying the data, we're virtualising it and sharing that data block between 10, 20, 200 - whatever number of environments the organisation is running."
For example, one Delphix client, an online retailer in the US, has 130 developers. Prior to implementation, it was doing DevOps, but only had seven shared environments. "With Delphix, they now have more than 200 shared environments," says Breslaw.
Similar to server virtualisation, whereby you share, for example, a single CPU between multiple virtual servers, Delphix does the same with data. When you install it, it makes a one-time copy of production data, which is compressed to about one-quarter of the size.
"These 'virtual databases' or data blocks then stay in synchronisation with production and can be provisioned, though they are not real copies; they can be shared and re-provisioned over and over, which is where the storage savings and speed come from," says Breslaw.
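The mechanism Breslaw describes is essentially copy-on-write sharing. A minimal sketch of that idea, assuming a simple block-store model (this is a conceptual illustration, not Delphix's implementation): many virtual databases read from one shared baseline, and each stores privately only the blocks it changes, so a "rewind" is just discarding that private overlay.

```python
# Illustrative copy-on-write sketch of data virtualisation: many "virtual
# databases" share one baseline set of blocks, and each privately stores
# only the blocks it changes. Conceptual model only, not Delphix's code.

class VirtualDatabase:
    def __init__(self, base_blocks):
        self.base = base_blocks   # shared, read-only baseline (one copy)
        self.delta = {}           # private copy-on-write overlay

    def read(self, block_id):
        # Prefer the private overlay; fall back to the shared baseline.
        return self.delta.get(block_id, self.base[block_id])

    def write(self, block_id, data):
        self.delta[block_id] = data   # only changed blocks use new storage

    def rewind(self):
        self.delta.clear()            # instant reset back to the baseline


# One baseline copy of "production", shared by 200 environments.
production = {i: f"block-{i}" for i in range(1000)}
envs = [VirtualDatabase(production) for _ in range(200)]

envs[0].write(42, "patched")
print(envs[0].read(42))                 # patched
print(envs[1].read(42))                 # block-42 (baseline untouched)
print(sum(len(e.delta) for e in envs))  # 1 extra block across 200 envs

envs[0].rewind()
print(envs[0].read(42))                 # block-42 again
```

Storage grows only with the blocks each environment actually modifies, which is why hundreds of environments can be provisioned from a single compressed baseline copy.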
That's why the company refers to its technology as "data-as-a-service", he adds. "If you look at what 'as-a-service' is, it's on-demand and centrally hosted. That's exactly what we're doing with data: we're placing all of your non-production data from across your organisation into a single place and then enabling access to it on-demand. That is a fundamentally new technology," says Breslaw.