“Let me make this clear: Continuous integration will never replace regression testing—regression testing by qualified testers, not by the developers, who understand the behavior of all the features supported in the previous iteration or sprint. As a developer, I only understand the feature I wrote and implemented. Don’t expect me to do a very good job in making sure that all the other features I don’t really understand are still working.”
Hanna is also wary of relying on automation tools to drive continuous testing efforts. While manual testing requires a person at the keyboard, automated testing is governed by scripts. In this push toward a faster life cycle, he is concerned about developers and testers losing sight of a project’s ultimate goals.
“For Continuous Delivery and integration to succeed, they rely heavily on individuals writing scripts for tools,” he said. “The scripts need to be written to test not only the feature you are implementing; the feature we delivered a year ago still has to work.
“There’s always a trade-off. Delivering high-quality systems fast means cutting corners, and cutting corners in Continuous Delivery has affected the most critical aspect of these projects: the requirements. I can get developers to write code very fast and push code into production, but what does the code do? Why are we forgetting that we’re only writing code to implement a feature, a requirement or a behavior that the customer wanted?”
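Hanna’s point—that automation scripts must keep exercising features shipped long ago, not just the current sprint’s work—is exactly what a CI regression suite encodes. Here is a minimal sketch in Python; the feature functions and thresholds are hypothetical, invented purely for illustration:

```python
# Hypothetical application code: one feature shipped in a past
# release, and one being implemented in the current sprint.

def legacy_discount(price: float) -> float:
    """Past release: 10% discount on orders over 100."""
    return price * 0.9 if price > 100 else price

def new_bulk_discount(price: float, quantity: int) -> float:
    """Current sprint: an extra 5% off for 10 or more items.
    Note it builds on the legacy feature, so a careless change
    here could break year-old behavior."""
    discounted = legacy_discount(price)
    return discounted * 0.95 if quantity >= 10 else discounted

# Regression tests: CI runs ALL of these on every commit, so work
# on the new feature cannot silently break the old one.

def test_legacy_discount_still_applies():
    assert legacy_discount(200) == 180.0

def test_legacy_no_discount_under_threshold():
    assert legacy_discount(50) == 50

def test_new_bulk_discount():
    assert new_bulk_discount(200, 10) == 171.0

if __name__ == "__main__":
    test_legacy_discount_still_applies()
    test_legacy_no_discount_under_threshold()
    test_new_bulk_discount()
    print("all regression tests passed")
```

The point of the sketch is the test list, not the business logic: the suite grows monotonically as features ship, so the developer of `new_bulk_discount` never has to personally understand `legacy_discount`—the pipeline checks it for them on every commit.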
The first inning
While the growth of agile and the rise of Continuous Delivery and integration are tangible, continuous testing is still in its infancy. Organizations are still figuring out what it is, and both developers and testers are still in the process of grappling with not only how accelerated testing affects them, but also how to automate it effectively.
“From the standpoint of implementation versus awareness, we’re in the first inning of a nine-inning game,” said SOASTA’s Lounibos. “Awareness is pretty strong. It feels a little bit like 2009 and 2010 in cloud computing. Everyone was talking about it, but there weren’t that many people implementing. Early adopters are out there, but people have to get familiar with what continuous testing even means: How do they implement it? What are the best practices?”
The early adopters, according to Applause’s Johnston, are the ones phasing out things like centers of excellence and large outsourcing contracts—the equivalent of a large standing army—in favor of a nimble Special Forces unit: integrated developer-and-tester teams implementing automated continuous testing.
“The companies that are trotting out the same playbook of mainframe to desktop and desktop to Web applications are in the tall grass, completely lost in the weeds,” he said. “That’s what it takes: Wiping the whiteboard clean and saying ‘Okay, all the muscles we’ve built in the past 15 years from Web, a lot of those don’t really apply. The big investment we made with this vendor or that longtime outsourcing relationship or that Center of Excellence we thought we’d be using for 30 years, that’s either not going to be a part of the solution as we go forward, or just a part.’ ”
As adoption climbs, testing in a continuously delivered environment is also moving away from a development and testing process partitioned into silos. Think of the developer cliché where someone slides a pizza under the door and code comes out. As developers and testers hop the fence, testing is becoming a more integrated and virtualized process aligned with a continuous ALM solution.