Digital disruption has fundamentally reshaped the business landscape over the last two decades, and this past year the trend has accelerated in a way that few could have predicted, lending new urgency to existing digital transformation plans. To meet the surge in digital demand, enterprises are accelerating plans for cloud migration, DevOps transformation, and enterprise application modernization. However, many organizations and CIOs are realizing just how complicated modernization can be.
The highly integrated nature of most enterprise applications further complicates these efforts. According to MuleSoft's 2020 Connectivity Benchmark, the average organization today uses more than 900 applications, and a single business workflow may touch dozens of them via microservices and APIs.
To ensure business processes keep running, testers must replicate the work users perform across multiple applications and verify that none of those workflows break when any of the applications are updated. That means tests must work seamlessly across multiple applications, architectures, and interfaces. Whether your organization is adopting new technologies, incrementally modernizing legacy systems, or both, automated software testing removes the need to trade innovation against risk mitigation: you can go fast, safely.
Align testing with DevOps and Agile models
To align innovation and testing, many organizations are shifting enterprise application delivery to Agile and DevOps models. While this approach is invaluable for delivering new software updates and features, it also tightens the timeline to thoroughly test new code, which is often reported as the biggest delay in the delivery process.
Therefore, modernizing software testing represents the most significant opportunity for improvement in delivery timelines. The faster delivery teams can ship updates, the faster the organization can build innovation that streamlines business processes and unlocks new streams of revenue.
Hypercare puts you at risk
To overcome testing challenges, some organizations have pivoted away from extensive pre-release testing in favor of hypercare: a post-release period, lasting weeks or months, in which an organization's most talented (and usually most expensive) resources dedicate their time to quickly fixing defects in production. Hypercare recognizes that business users are unlikely to catch all defects in pre-release testing, but it is not an adequate replacement: tying up those resources means the innovation backlog continues to grow, costs rise, and so does risk.
Organizations are better off learning to prioritize and deliberately test specific aspects of their software using a risk-based approach. Designing an effective risk-based testing strategy, however, requires collaboration between business users and IT, which is much easier said than done.
Use a risk-based approach
Risk-based testing improves quality and reduces production defects. It can also significantly reduce software testing effort. This approach shifts the focus from test coverage to risk coverage. While both metrics are important, incorporating risk coverage into the testing strategy makes it easier to align testing activities with business objectives.
Risk coverage tells you what percentage of business risk is covered by test cases. It is often the case that 10% of a test suite will cover 80% of the business risk. This approach significantly reduces the number of tests needed to deliver high-quality releases to production.
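The selection logic behind this claim can be sketched in a few lines. Below is an illustrative Python example, not any vendor's actual tooling: the test names and risk weights are hypothetical, standing in for weights that business users and IT would assign together. It ranks tests by business risk and picks the smallest subset that reaches a target risk-coverage threshold.

```python
# Illustrative sketch of risk-based test selection.
# Test names and risk weights below are hypothetical.

def risk_coverage(selected_tests, risk_by_test):
    """Percentage of total business risk covered by the selected tests."""
    total = sum(risk_by_test.values())
    covered = sum(risk_by_test[t] for t in selected_tests)
    return 100 * covered / total

def select_for_target(risk_by_test, target_pct):
    """Greedily pick highest-risk tests until the target coverage is met."""
    ranked = sorted(risk_by_test, key=risk_by_test.get, reverse=True)
    selected = []
    for test in ranked:
        if risk_coverage(selected, risk_by_test) >= target_pct:
            break
        selected.append(test)
    return selected

# Hypothetical risk weights agreed on by business users and IT.
risk_by_test = {
    "checkout_flow": 40,
    "invoice_posting": 25,
    "login": 15,
    "profile_edit": 5,
    "theme_toggle": 1,
}

# A small, high-risk subset of the suite covers most of the business risk.
subset = select_for_target(risk_by_test, 80)
```

In this toy example, three of the five tests already cover more than 80% of the weighted risk, which mirrors the article's point: a small fraction of the suite can cover most of the business risk.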
A risk-based approach is also helpful in the event of a major upgrade or migration. Some enterprise application vendors offer readiness checks, but enterprises should consider seeking advanced assessments from third parties. These advanced assessments focus on identifying the impacts of the upgrade on development, testing, integration, and security. The result is a clear picture of the risk the upgrade poses, the tests that need to be executed, and the test cases that are redundant or no longer required in the new environment.
Scale test automation
A key to cutting costs lies in an organization’s ability to efficiently scale test automation. With test automation, organizations can significantly improve test coverage and catch defects much earlier in the delivery lifecycle, when they are less expensive to fix.
Organizations can scale test automation both horizontally across applications and vertically within an application with low-code and no-code solutions. These solutions can be adopted and learned quickly by existing testing resources, regardless of technical skill. A model-based approach extends the benefits further by organizing test cases into reusable building blocks that can be repurposed across projects and teams. By increasing test resiliency, this approach sharply reduces the high maintenance costs associated with script-based tools.
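The reusable-building-block idea can be illustrated with a minimal sketch. This is not any particular model-based tool's API; the step names, applications, and state shape are all hypothetical. The point is structural: each step is defined once, and many test cases compose the same blocks, so a change to one workflow means updating one block rather than every script that touches it.

```python
# Illustrative sketch of model-based test composition.
# Step names and workflow state below are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Step:
    """A reusable test building block with a single responsibility."""
    name: str
    action: Callable[[Dict], Dict]  # takes workflow state, returns new state

def run_test(steps: List[Step], state: Dict) -> Dict:
    """Execute a test case composed of reusable steps, in order."""
    for step in steps:
        state = step.action(state)
    return state

# Blocks defined once and shared across teams and projects.
login = Step("login", lambda s: {**s, "user": "authenticated"})
create_order = Step("create_order", lambda s: {**s, "order": "ORD-1"})
post_invoice = Step("post_invoice", lambda s: {**s, "invoice": "posted"})

# Two test cases reuse the same login and ordering blocks; if the
# login flow changes, only the one block needs updating.
order_test = [login, create_order]
billing_test = [login, create_order, post_invoice]

result = run_test(billing_test, {})
```

In script-based suites, by contrast, each test case typically re-implements the login and ordering steps inline, so a single UI change can break dozens of scripts at once; that duplication is the maintenance cost the model-based approach avoids.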
As the infrastructure that supports the whole business, enterprise applications are central to both internal and customer-facing innovation. Modernizing this infrastructure gives enterprises a solid foundation for digital transformation — but only if they can do so without introducing business risk. The complex nature of these applications, coupled with misconceptions about the level of testing required, means many enterprises face an uphill quality battle that could stall transformation efforts.
Enterprises that recognize the importance of a modern testing approach that is aligned to business objectives, measures risk, and can effectively scale upwards, stand to make significant gains in 2021.