Agile is neither a well-understood development methodology nor a well-followed best practice. It has potential to be both. But many organizations do not succeed because they fail to incorporate teams across the application life cycle.
In the July 2012 Market Snapshot Report from Voke’s Lisa Dronzek and Theresa Lanowitz, only 28% of survey respondents reported success with agile. The report asserts that agile appears to be shifting the software engineering process to one that is focused on development—“to the exclusion of QA and operations.”
When implemented correctly, agile encompasses the entire application life cycle. It unites business, development and infrastructure with a common focus on speed, quality and value. This focus culminates in application performance as experienced by the customer or end user. Because performance remains the most critical factor affecting adoption and use of an application, agile shops must focus more on improving speed, measuring quality, and delivering value.
Speed in agile accounts for time-to-market or time-to-acceptable-completion of an application: the time it takes from idea conception to delivery of a product to the customer. "Acceptable completion," or "doneness criteria," is paramount. In an agile process with continuous development, testing and deployment, a deliverable is considered done only when it meets all doneness criteria, both functional and performance.
The agile process should build in performance management throughout each stage, from concept to support. This does not put speed at risk. Build-test-deploy automation provides a reliable and fast way to verify features and functions early and often throughout the development life cycle. This automation, combined with an agile process that has garnered acceptance throughout development and infrastructure, speeds the delivery of highest-priority features and functions to the customer.
While time-to-market can be measured from idea to completion in finite terms, measuring quality in a delivered application is more ambiguous. Organizations will often measure the number of defects found in development and compare that to the number of production incidents discovered by infrastructure (or worse, the customers).
On the surface, this is a good objective measurement. However, quality must also be weighed against subjective application failures. The customer's perception of failure is arguably more important than the count of discovered defects when it comes to user adoption and acceptance.
Understanding the tolerance or patience of users for any particular transaction is critical. Performance failures should be considered functional failures. If a transaction takes 30 seconds to complete and users do not wait around for results to be delivered, the function itself has subjectively failed.
Establishing performance thresholds within the build-test-deploy process, and automating the pass/fail criteria for each stage in the life cycle, helps detect both subjective and objective failures earlier. This extracts more value from your automated build-test-deploy system, provides more time for remediation of critical failures, and avoids potentially damaging negative customer experiences.
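To make this concrete, a performance threshold can be expressed as an automated pass/fail check that runs alongside functional tests. The following is a minimal sketch; the transaction names, threshold values, and function names are illustrative assumptions, not a prescription from any particular tool.

```python
import time

# Hypothetical doneness criteria: maximum acceptable response time
# (seconds) per transaction. Values here are illustrative only.
THRESHOLDS_S = {
    "login": 2.0,
    "search": 3.0,
    "checkout": 5.0,
}

def measure(transaction, fn):
    """Time one transaction; return (elapsed_seconds, within_threshold)."""
    start = time.perf_counter()
    fn()
    elapsed = time.perf_counter() - start
    return elapsed, elapsed <= THRESHOLDS_S[transaction]

def gate(results):
    """Pass/fail the build: fail if any transaction missed its threshold.

    `results` maps transaction name -> (elapsed_seconds, within_threshold).
    Returns (passed, list_of_failing_transactions).
    """
    failures = sorted(name for name, (_, ok) in results.items() if not ok)
    return (not failures, failures)
```

Wiring a check like this into the build pipeline turns the 30-second-transaction problem above into an objective, automated failure rather than a defect the customer discovers.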
Ensuring both objective and subjective quality does not guarantee an application will deliver value to its users. Value is also a twofold tenet of agile. Perceived value and actual value to a customer or user must be determined. This value should be expressed as success in delivering specific or requested features or functions, as well as whether or not the application meets (or better yet, exceeds) expectations. Value to the business or application owner should be evaluated in terms of market position, brand, productivity and revenue.
As the statistics in the Voke report show, it is not straightforward or easy for organizations to adopt agile successfully. Success comes when business stakeholders foster a high-performance culture that delivers the highest-value features to customers in the fastest time possible.
To achieve this and to ensure the appropriate focus on performance throughout the application life cycle, organizations must:
1. Break down the barriers between business stakeholders, customers, development and infrastructure teams. Agile demands collaboration and alignment throughout the application life cycle. Teams should unite around a common focus on performance and the customer.
Best practices for bridging this gap include enabling the performance engineering team to work directly with disaster recovery and capacity planning teams. In this situation, if capacity planning forecasts a sharp increase in users due to a successful marketing campaign, performance engineers would execute a number of test scenarios to mitigate potential risks. Disaster recovery would test resiliency and prepare for both failover and fallback scenarios.
2. Leverage automated tests (build-test-deploy systems) and “quality gates.” As mentioned, automation is essential for agile. Automation allows critical procedures, like testing, to run in the background throughout the development process.
Quality gates establish a set of "pass" parameters that each build must meet to receive the quality stamp. These parameters, and how quality is determined, will vary as the application progresses through the development desktops, build verification test, quality assurance, preproduction and production environments. Quality gates prevent builds from progressing from one environment to the next until acceptable performance and usability thresholds are met, while maximizing automation to deliver quality quickly.
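One way to picture quality gates is as a table of per-environment pass parameters that tightens as a build moves toward production. The sketch below is an assumption-laden illustration: the environment names, metrics and numbers are invented for the example, and real gates would encode a team's own doneness criteria.

```python
# Illustrative quality-gate parameters per environment. Gates tighten
# as the build moves toward production; all values are hypothetical.
QUALITY_GATES = {
    "build_verification": {"max_response_s": 5.0, "max_open_defects": 10},
    "quality_assurance":  {"max_response_s": 3.0, "max_open_defects": 3},
    "preproduction":      {"max_response_s": 2.0, "max_open_defects": 0},
}

def may_promote(environment, metrics):
    """Allow a build to progress only if it meets every gate parameter
    for the given environment."""
    gate = QUALITY_GATES[environment]
    return (metrics["response_s"] <= gate["max_response_s"]
            and metrics["open_defects"] <= gate["max_open_defects"])
```

A build that passes the quality-assurance gate can still be blocked at preproduction, where the same metrics are held to stricter thresholds.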
An increased focus on performance with speed, quality and value is fundamental. Integrating this focus with maturing agile teams and customer-driven collaboration results in a customer-centric—not a developer-centric—movement. This customer-centric movement does not and cannot exclude QA and operations because customer experience and perception are paramount.
This is agile implemented correctly. This is agile improving application development and deployment. Agile is more than a daily standup and a backlog. It is more than the statement, "Yes, we are agile." It is an integrated process that spans the application life cycle and the organization's culture, includes the customer, incorporates performance at every stage, and delivers on the promise of speed, quality and value to the customer.
Todd DeCapua is the VP of Channel Operations and Services for Shunra Software, which sells application performance-management tools.