Automated testing is crucial to support DevOps
Continuous development requires continuous testing, and development teams are key
"Everyone in this room has tested something," said Ben Riley, DevOps specialist at CA Technologies, at the opening of a talk on continuous testing at Computing's annual DevOps Summit today. He was joined on-stage by Richard Jordan, practice manager at Nationwide Building Society.
Testing as a concept and practice hasn't changed in 30 years, said Riley. It begins with defining an idea and writing requirements; testers then work through many steps to rationalise those requirements into something that makes sense.
DevOps at Nationwide officially began about 18 months ago, but had been under way unofficially for at least five years before that. It has not reduced the need for testing, though - in fact, it has done the opposite: feedback from testing is now the primary method of measuring progress.
There are two models of testing, which Riley and Jordan call 'ice creams' and 'volcanoes' after their shapes. 'Ice creams' are the old, inverted form: they begin with large amounts of manual testing at the top and funnel down to a small amount of unit testing. 'Volcanoes' turn that pyramid the right way up: thanks to automation, a broad base of unit tests is topped by progressively smaller amounts of API, integration, component and GUI testing. The smallest segment, at the volcano's peak, is manual session-based testing.
Ice creams face multiple challenges: traditional manual UI testing is slow, and doesn't lend itself to stable, repeatable testing at scale. Under the volcano model, by contrast, development teams are encouraged to write their own automated tests ("quick to build, stable to run") so they can "fail fast and fail early", while bugs are still quick and cheap to fix.
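Neither speaker showed code, but a minimal sketch gives a sense of the kind of test that forms the volcano's base: a small, deterministic unit test that runs in milliseconds with no environment to set up. The function and test harness below (Python with pytest) are purely illustrative assumptions, not anything from Nationwide or CA Technologies.

```python
# Illustrative only: a fast, self-contained unit test of the sort that sits
# at the broad base of the "volcano". The function under test is invented
# for this example.

def validate_sort_code(sort_code: str) -> bool:
    """Return True if the string looks like a UK sort code, e.g. '12-34-56'."""
    parts = sort_code.split("-")
    return len(parts) == 3 and all(p.isdigit() and len(p) == 2 for p in parts)


def test_accepts_well_formed_sort_code():
    assert validate_sort_code("12-34-56")


def test_rejects_malformed_sort_codes():
    # Each case fails fast, with no browser, database or test environment needed.
    for bad in ["123456", "12-34", "ab-cd-ef", ""]:
        assert not validate_sort_code(bad)
```

Hundreds of tests like this can run in seconds on every commit, which is what makes them "quick to build, stable to run" compared with a scripted GUI test.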
Clearly, continuous testing through automation is key to supporting DevOps' continuous development. Nationwide is moving away from ice creams and towards volcanoes, but still has some way to go: "There won't be a big bang, but we'll chip away at it," said Jordan. Part of that is selling the idea to testing teams and instilling an automated-testing mindset from the start. His team is also addressing technical and design debt at the same time.
There has been some pushback from coders, who are now being asked to do their own testing, but it's a mindset change. Jordan has been countering it with a bit of carrot (offering new tools) and a bit of stick (this has to change).