In the race to implement continuous (or simply faster) delivery, the build-and-deploy side has gotten most of the press while continuous testing has languished, according to technology analyst Theresa Lanowitz. All too often, the first technologies associated with virtualization are VMware, hypervisors, Microsoft Azure or the cloud.

“Part of the problem with service virtualization is that the word ‘virtualization’ has a strong attachment to the data center. Everyone knows the economic benefits of server virtualization,” said Lanowitz, founder of analyst firm Voke, who’s been talking about the topic since 2007. “A lot of people don’t know about service virtualization yet… It’s moved from providing virtual lab environments to being able to virtualize a bank in a box or a utility company in a can so there are no surprises when you go live.”

No time for testing
Testing remains a bottleneck for development teams, or, worse, a luxury. Just ask Frank Jennings, TQM performance director for Comcast. Like many test professionals these days, he must exercise a diverse array of consumer and internal products and systems across many test scenarios.

“The real pain point for my team was staging-environment downtime,” he said in an October 2012 webinar moderated by SD Times editor-in-chief David Rubinstein. “Often, downstream systems were not available, or other people accessing those dependent systems affected test results.”

(Service Virtualization: A webinar from SD Times)
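The fix Jennings describes is the heart of service virtualization: when a downstream system is down or contended, tests run against a stand-in that answers with realistic, repeatable responses. Commercial tools record and replay real traffic, but the gist can be sketched with a hand-rolled stub. The sketch below is illustrative only; the endpoints, payloads and port are hypothetical, not any particular vendor's API.

```python
# Minimal sketch of a virtual service: a stand-in for an unavailable
# downstream dependency that returns canned responses, so tests keep
# running even when the real system (or staging environment) is down.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Canned responses, recorded or hand-written, for the dependency's API.
# These paths and payloads are made-up examples.
CANNED = {
    "/accounts/42": {"id": 42, "status": "active", "balance": 1250.00},
    "/accounts/42/history": {"transactions": []},
}

class VirtualService(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CANNED.get(self.path)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # Tests point at http://localhost:8080 instead of the real system.
    HTTPServer(("localhost", 8080), VirtualService).serve_forever()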

Automating the test portion of the life cycle is often an afterthought, however. “People are looking for operational efficiency around the concepts of continuous release. We walk into companies that say ‘We want to go to continuous release,’ and we ask, ‘What are your biggest barriers?’ It’s testing,” said Wayne Ariola, chief strategy officer for Parasoft, a code quality tool vendor.

“Today, I would say 90% of our industry uses a time-boxed approach to testing, which means that the release deadline doesn’t change, and any testing happens between code complete and deadline. That introduces a significant amount of risk. The benefit of service virtualization is you can get a lot more time to more completely exercise the application and fire chaos at it.”
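Because a virtual service is under the test team's control, it can also misbehave on command, which is what makes it possible to "fire chaos" at an application before release. As a rough illustration (again hypothetical, with made-up failure-rate and latency parameters), the same kind of stub can inject slow responses and intermittent outages:

```python
# Hypothetical sketch: a virtual service that injects latency and
# faults so testers can exercise failure modes on demand.
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

FAILURE_RATE = 0.2          # illustrative: 20% of calls fail
LATENCY_RANGE = (0.1, 2.0)  # illustrative: injected delay, seconds

class ChaoticVirtualService(BaseHTTPRequestHandler):
    def do_GET(self):
        # Simulate a slow downstream system.
        time.sleep(random.uniform(*LATENCY_RANGE))
        if random.random() < FAILURE_RATE:
            # Simulate an intermittent outage.
            self.send_response(503)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"status": "ok"}')

if __name__ == "__main__":
    HTTPServer(("localhost", 8081), ChaoticVirtualService).serve_forever()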

The costs of consumerization
While speed to market tantalizes, the costs of failure are ever greater as software “eats the world” (as Marc Andreessen put it). Even the most mundane of industries is not immune to the powerfully fickle consumer.

Ariola uses the example of a bank that couldn’t innovate fast enough when its competitors made game-changing moves such as automating savings or creating smartphone check deposits. “When one bank advertised the feature of taking a picture of your check with your phone to deposit it, all the other consumer banks had to get this thing pretty fast. You need speed to differentiate in business, but the risk of failure is so much higher,” he said.

About Alexandra Weber Morales

Alexandra Weber Morales is a freelance writer (and singer and songwriter).