The last few years have seen automation become a major focus, as organisations look to accelerate software innovation and sharpen their competitive edge, writes Stuart Ashby, DevOps specialist at Compuware.
Gartner forecasts that spending on Robotic Process Automation (RPA) software alone will reach $2.4 billion by 2022. These capabilities offer significant advantages through productivity, quality and efficiency improvements, and allow organisations to redeploy resources to support higher-value tasks that drive the business forward.
Automation is especially beneficial in helping organisations to bring new digital services to market faster, by reducing the need for manual testing during software development. This enables them to improve code quality whilst increasing velocity, so they can deliver new experiences to customers and end-users with increasing regularity.
The consistency that automation provides means organisations can innovate at this faster pace without the fear that increased speed will introduce more bugs and unforeseen consequences. However, despite its growing role in powering customer-facing services, the mainframe is often overlooked as a priority for automated testing, which could be holding back digitalisation efforts from reaching their full potential.
Acknowledging the Phobia
It’s an oft-shared fact that most of the world’s biggest banks, retailers and insurance companies continue to run on the mainframe. They remain loyal to the platform due to its stability, security, performance and reliability, as well as its decades of intellectual property in the form of business logic and data.
Forrester found that 72 percent of customer-facing applications at these organisations are either completely or very reliant on mainframe processing. As a result, it’s critical that organisations are able to deliver change on the mainframe as quickly as they can on their distributed systems. This universal agility is essential to the success of any efforts to enhance consumer-facing digital services, such as mobile apps, AI-powered chatbots and e-commerce platforms.
However, COBOL programs are often poorly documented, and experienced mainframe developers are becoming increasingly scarce. As such, it’s easier said than done for many organisations to drive innovation on the mainframe.
Modern IT teams are often reluctant to change anything that impacts mainframe code for fear that they might break something. However, while Homer Simpson did once proffer the wisdom that “if you don’t do anything, you’ll never make mistakes,” that advice has its pitfalls in the ultra-competitive world of today’s digital economy. Fear of failure can in fact become the biggest challenge that organisations face in the race to keep up with customers and competitors.
Moving Past the Fear
When you consider the volume of mainframe programs that organisations rely on, there’s a major opportunity to increase the pace of innovation if IT leaders can find a way to improve their teams’ confidence in working on these applications. Thorough testing is key to providing that self-assurance, helping IT teams to ensure the quality of the code they’re working on and preventing any unforeseen consequences of change.
However, it’s incredibly time-consuming and labour-intensive to test mainframe code manually, and especially difficult to create test scripts and test data sets without specialist skills. That’s why many IT teams take shortcuts, and vital quality checks that are common in distributed environments, such as unit testing, aren’t conducted on the mainframe.
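To make that concrete, the sketch below shows the kind of automated unit test that distributed teams take for granted. It is written in Python for brevity rather than COBOL, and the calculate_interest routine is a hypothetical stand-in for the sort of business rule a mainframe program might implement; the point is that checks this small are cheap to automate and run on every change.

    # A hypothetical stand-in for a business rule that might live in a
    # COBOL program: simple interest applied to an account balance.
    def calculate_interest(balance: float, rate: float) -> float:
        """Apply simple interest; no interest on empty or overdrawn accounts."""
        if balance <= 0:
            return 0.0
        return round(balance * rate, 2)  # round to pennies

    # Automated unit tests (runnable with pytest): executed on every change,
    # so a defect in the rule is flagged as soon as it appears.
    def test_interest_on_positive_balance():
        assert calculate_interest(1000.00, 0.05) == 50.00

    def test_no_interest_on_overdrawn_account():
        assert calculate_interest(-250.00, 0.05) == 0.0

    def test_result_rounded_to_pennies():
        assert calculate_interest(333.33, 0.03) == 10.00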
When these vital checks are skipped, mainframe code defects can go unnoticed until functional, integration or regression testing later in the development process, when it becomes more problematic and time-consuming to rectify them. Worse still, there’s an increased risk that a bug will be put into production and go undetected until a process fails, and a customer or business user experiences a problem with a front-end service.
This creates a vicious cycle, where IT teams don’t want to work on the mainframe for fear of introducing problems that they know could have a major impact, but they aren’t taking appropriate steps to reduce that risk when they do finally have to make changes. Organisations can solve this dilemma and significantly accelerate innovation by automating unit, functional and integration testing on the mainframe.
Not only does this remove the need for mainframe specialists to create and execute tests every time code is worked on, it also ensures that problems are identified and highlighted to the IT team as they’re introduced. As a result, developers can fine-tune their code as they’re working on it, and testers can identify any bugs before passing it along to the next stage.
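As a rough sketch of how that feedback loop might be wired up, the script below runs a series of automated suites on every change, cheapest first, and stops at the first failure so problems surface while the code is still fresh in the developer’s mind. The suite names and locations are illustrative assumptions, not any particular vendor’s tooling.

    import subprocess
    import sys

    # Illustrative test suites, ordered so the quickest feedback arrives first.
    SUITES = [
        ("unit", "tests/unit"),
        ("functional", "tests/functional"),
        ("integration", "tests/integration"),
    ]

    def run_checks() -> int:
        """Run each suite in turn; stop at the first failure."""
        for name, path in SUITES:
            print(f"Running {name} tests in {path} ...")
            result = subprocess.run([sys.executable, "-m", "pytest", path])
            if result.returncode != 0:
                print(f"{name} tests failed -- fix before moving on.")
                return result.returncode
        print("All suites passed.")
        return 0

    if __name__ == "__main__":
        sys.exit(run_checks())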
Supercharging the Mainframe
Automating testing is a great place to start, but it’s only one piece of the puzzle in enabling organisations to fully unleash the power of the mainframe to drive their innovation.
As well as ensuring that new mainframe code is acceptable, IT teams also need to make sure it doesn’t negatively impact the availability, performance or functionality of their existing services and wider applications in distributed environments. Automation again holds the key, allowing IT teams to maintain code quality throughout the development process by simplifying every stage of the test cycle, from unit and functional through to integration and regression testing.
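One common way to automate that safety net is a simple regression check: run the changed program against known inputs and compare its output byte-for-byte with a baseline captured from the current version. The sketch below assumes the program’s output can be written to a flat file, and the paths are illustrative.

    import filecmp
    from pathlib import Path

    # Illustrative paths: a baseline captured from the existing program and
    # the output produced by the changed version against the same inputs.
    BASELINE = Path("baselines/monthly_statement.txt")
    CANDIDATE = Path("output/monthly_statement.txt")

    def regression_check(baseline: Path, candidate: Path) -> bool:
        """Return True if the new output matches the recorded baseline."""
        # shallow=False forces a byte-for-byte comparison rather than
        # relying on file metadata.
        return filecmp.cmp(baseline, candidate, shallow=False)

    if __name__ == "__main__":
        if regression_check(BASELINE, CANDIDATE):
            print("Outputs match: no regression introduced.")
        else:
            print("Outputs differ: the change has altered existing behaviour.")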
By “shifting left” with automated testing and maintaining that thoroughness throughout the development cycle, organisations can ‘fail fast’ on the mainframe, supercharging innovation without fear of tripping themselves up. Better still, they can improve quality, velocity and efficiency on the mainframe in spite of the growing shortage of experienced developers, giving themselves a major edge as they look to drive competitive advantage through digital innovation.