Whatever your industry, digital transformation is unavoidable – and that transformation has shifted the way enterprises deploy new software. IDC research has found that 40% of organisations now use microservices and containers to build and manage their application portfolios, with that figure expected to reach 50% by 2027.
But if these systems are far more flexible than their monolithic forebears, boosting scalability and speeding up development cycles, their very versatility can also bring challenges, especially when it comes to quality assurance. As an increasing number of enterprises are discovering, these problems can readily be overcome – not least by leveraging the immense power of autonomous testing.
Manual labour
Few people appreciate better than Nagendra BS just how quickly application development is moving. The VP of digital assurance – practice and solutions at Hexaware, he has worked at the Indian IT service management company for the best part of two decades. And, as Nagendra says, the adoption of microservices and containers, as well as the DevOps philosophy writ large, can best be understood in terms of development speed.
“These technologies help keep change rapid,” Nagendra explains, adding that giants like Netflix and Microsoft can now publish code updates every single minute. Nor are these mere hypotheticals. With a sprawling, complex application infrastructure to supervise, for instance, Amazon engineers alone deploy code every 11.7 seconds.
This pace would naturally be impossible using so-called ‘waterfall’ techniques, whereby errors are painstakingly resolved before applications are released to the public. By contrast, cutting-edge development involves pumping out tweaks at speed, isolating updates and fixing them on the fly.
But, as Nagendra explains, microservices and the like are only as good as the testing regimes underpinning them. Quality assurance, he notes, can quickly become a “bottleneck”, struggling to keep pace with developers – a problem often compounded by just how much testing still relies on human intervention. As Nagendra puts it: “They simply can’t keep up.” That’s doubly true given that line-of-business personnel are often unfamiliar with the intricacies of modern development, putting unreasonable pressure on engineers.
To counter these challenges, some enterprises have folded traditional test automation into their testing regimes. Yet these approaches still depend heavily on human intervention across the software development life cycle (SDLC) – engineers must develop and maintain automation scripts, perform impact analysis and learn from data sources such as application logs – even while a machine carries out the legwork.
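To see why that dependency matters, consider a purely illustrative, hand-written script of the kind such regimes rely on – the URL, element IDs and credentials below are hypothetical placeholders, not any client’s real application. Every hard-coded selector and assertion is something an engineer must revisit whenever the application changes:

    # Illustrative only: a conventional, hand-maintained UI automation script.
    # The page URL, element IDs and credentials are hypothetical placeholders.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("s3cret")
        driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

        # A hard-coded expectation: it breaks the moment the page title changes,
        # and a human must analyse the failure and patch the script.
        assert "Dashboard" in driver.title
    finally:
        driver.quit()

Multiply that maintenance burden across thousands of such scripts and the bottleneck Nagendra describes becomes obvious.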
Machine learning
Automation can certainly be a boon for overworked quality assurance professionals. Yet as that last example implies, humans are not removed from the testing equation completely. As Nagendra warns, this can keep costs high, even as applications remain vulnerable to glitches or cyberattacks.
With this in mind, dynamic businesses are moving from automation to fully autonomous testing. Dispensing with human intervention altogether, autonomous testing relies entirely on AI and machine learning (ML), an approach useful everywhere from analysing test results to bug triaging.
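How that works under the bonnet will differ from platform to platform, but as one hedged sketch – not a description of ATOP’s internals – an ML-driven triage step might cluster similar failure messages so related defects can be investigated together. The log lines and cluster count below are invented for illustration:

    # Sketch of ML-assisted failure triage: group similar test failure messages
    # so related defects land in the same triage bucket. All data is hypothetical.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    failures = [
        "TimeoutError: /api/orders did not respond within 30s",
        "TimeoutError: /api/orders did not respond within 30s after retry",
        "AssertionError: expected status 200, got 500 from /api/payments",
        "AssertionError: expected status 200, got 503 from /api/payments",
        "ElementNotFoundError: selector '#checkout-button' missing on page",
    ]

    vectors = TfidfVectorizer().fit_transform(failures)  # turn messages into numeric features
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

    for cluster, message in sorted(zip(labels, failures)):
        print(cluster, message)  # one line per failure, grouped by triage bucket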
Hexaware’s own work is a case in point. The company’s Autonomous Test Orchestration Platform (ATOP) is a unified plug-and-play system that supports both traditional test automation and autonomous testing, boasting increased efficiency and better ROI, among other benefits. ATOP can serve as a one-stop shop for all functional and non-functional testing needs across the UI, API and data layers of an application.
Not that the path to fully autonomous testing is straightforward. Perhaps most fundamentally, Nagendra says that embracing systems like ATOP requires a change in client philosophy. “There will always be some hesitation in investing in transforming the testing,” he says. “People feel it’s too futuristic.”
Equally important, Nagendra continues, is the question of expense. Many enterprises have traditionally taken a ‘cost-driven’ approach – meaning they focus exclusively on how much quality assurance impacts their bottom line. But this ignores what Nagendra calls the “outcome” of quality assurance, particularly faster testing cycles and improved portfolio quality.
Whatever the challenges, it’s clear that companies battle hard to mitigate them. In the case of Hexaware, Nagendra works to explain the advantages of its ATOP system – a pitch that’s especially vital if an enterprise has already tried regular automation but seen no appreciable jump in ROI.
From there, Nagendra and his team take time to speak with DevOps teams to understand how they can assist. This is often accompanied by practical investigations. To explain how this works in practice, Nagendra gives the example of one hypothetical client.
“We organise a structured two-week exercise,” he says, “wherein we look at every activity that needs to be done across the software testing life cycle.” Only after identifying where autonomous testing could help, adds Nagendra, does implementation actually start.
It matters, too, that the Indian IT giant offers a holistic service, with both out-of-the-box autonomous testing and customised alternatives. For more hands-off enterprises, Hexaware proposes three-to-five-year managed service contracts, up and running in as little as a month, under which Nagendra and his colleagues take “complete responsibility” for a client’s quality assurance, with committed outcomes for cost, quality and speed.
For more specialised clients, meanwhile, integration is made simpler by ATOP’s open-source structure, which means even Hexaware’s competitors can readily work with it.
Autonomous test cases
Examine the statistics and this thoughtful, flexible approach clearly pays dividends. All told, ATOP covers more than 200 autonomous testing use cases, spanning everything from script generation and maintenance to root-cause and failure analysis, test execution and performance modelling.
ATOP’s ROI figures are striking too. With 100% test and automation coverage, enterprises can expect quality assurance cycles four times faster than those reliant on human intervention. That’s accompanied by incremental savings of up to 70% on quality assurance, with critical defect slippage dropping to near zero. With numbers like that, it’s no wonder Nagendra has seen ATOP adopted by a number of Hexaware partners.
“It has been tried and tested,” he says. “We just have to start using it.” Given how exhausted modern quality assurance professionals would get otherwise, that’s surely just as well.