Shortcut Time-to-Market with Automated Code Testing

John Chang, Head of Solution Design, CAST

The software development methodologies and processes of the last two decades (Agile, Test-Driven Development, Continuous Integration, and DevOps) have all emerged from a common goal: getting high-quality products to customers faster. In fact, the challenge of balancing high quality with speedy delivery underpins every conversation I have with CIOs and senior application development leaders.

The key is waste. Or, more specifically, minimizing waste in the development pipeline. In the context of testing, waste accumulates when non-functional defects are passed through the quality assurance (QA) process all the way to user acceptance testing (UAT) without being addressed. When defects are finally detected in UAT (or, worse, after deployment), the code is sent back to technical support to diagnose, to development to revise, and to QA to re-test much of what has already been tested. This is waste. And it is costly, time-consuming, and demoralizing.

Implementing an automated code review solution in your development cycles will remove a remarkable amount of waste. Peer code reviews are a good idea in theory, but their value diminishes in large enterprises with dozens of developers, engineers, and architects working together on the same system. Automated code testing can provide a consistent and untiring evaluation of whether code is ready to be tested. Said differently, verifying that source code is free of coding errors (or has an acceptably low level of them) before releasing it to QA for testing reduces the amount of waste in application development.

But, wait. It is not that simple. You cannot just buy the cheapest code analysis tool off the shelf and tell your development team to use it. To successfully integrate automated code testing into your process and shorten time-to-market, you must:

1. Break Down Development Silos

The most damaging software defects occur when different components interact, especially when they are built on different technologies (for example, a presentation framework retrieving data through business logic source code). Many development teams are set up in silos, with cliques forming around the components they work on. Automated code analysis solutions must be good at detecting errors, but they must also be good at analyzing how the different parts of a whole application interact. These tools must facilitate conversations across technology silos and encourage teams to work together to create more cohesive software. When selecting a code analysis solution, make sure it addresses the interaction of the technologies within your portfolio.

2. Take a Risk Tolerance Approach

Non-functional defects are caused by coding errors that any developer can make, no matter how experienced she or he is. The likelihood of those errors causing a serious incident depends on their severity and density. In other words: how bad are the coding errors, and how many of them are there? Eliminating every error is rarely realistic given cost and business requirements, so you must be clear about your tolerance for risk. For example, you can set a risk density metric, such as the number of allowable errors per thousand lines of code (KLOC), or a tolerable change in risk, such as a maximum 1 percent increase in errors compared to previous releases. Engage your code analysis solution provider to help you baseline risk in your applications.
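
As a back-of-the-envelope illustration, the sketch below shows how such a tolerance might be checked: a cap on defect density per KLOC plus a cap on the relative increase against the previous release. The thresholds and figures are assumptions made for the example; baseline your own with your code analysis solution provider.

```python
# Illustrative risk-tolerance check. The limits below are examples only;
# set them from the baseline agreed with your analysis provider.

def defect_density(defects: int, lines_of_code: int) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects / (lines_of_code / 1000)

def within_tolerance(defects: int, lines_of_code: int,
                     baseline_density: float,
                     max_density: float = 2.0,            # allowable defects per KLOC (example)
                     max_relative_increase: float = 0.01  # max 1% increase vs. previous release
                     ) -> bool:
    density = defect_density(defects, lines_of_code)
    if density > max_density:
        return False
    if baseline_density > 0:
        increase = (density - baseline_density) / baseline_density
        if increase > max_relative_increase:
            return False
    return True

# 120 defects in 80,000 lines of code is 1.5 defects per KLOC.
print(defect_density(120, 80_000))          # 1.5
print(within_tolerance(120, 80_000, 1.49))  # True: ~0.7% increase, within tolerance
print(within_tolerance(120, 80_000, 1.40))  # False: ~7% increase over the previous release
```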

3. Enable Team Leaders to Steer

Statistics and metrics from automated code analysis must be aggregated at the application and portfolio level. Application team leaders or owners must be able to look at the results as a whole and assess which areas are most critical to address in order to meet the risk tolerance level set by the organization. Does the application show signs of increased security vulnerabilities? Have the latest code changes affected its performance? Team leaders and managers must have a good understanding of these elements of software health and make informed decisions about where to focus their resources, instead of tasking developers with arbitrary hygiene and syntax fixes.


Senior leadership must also be able to gain insights at a group or portfolio level in order to evaluate the risks within their own respective scopes. By providing direction based on business requirements for stability, performance, security, or maintainability of applications, senior leadership can steer the organization to make the right investments. Look for automated code analysis solutions that provide meaningful metrics and insightful analytics based on industry best practices.
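
To make the aggregation concrete, the sketch below rolls component-level findings up to the application level so that leaders see where risk concentrates instead of a flat list of violations. The field names and health areas are assumptions for the example, not the output format of any particular tool.

```python
# Hypothetical roll-up of analysis findings to the application level.
from collections import defaultdict

findings = [
    {"application": "billing", "health_area": "security",    "severity": "high"},
    {"application": "billing", "health_area": "performance", "severity": "medium"},
    {"application": "crm",     "health_area": "security",    "severity": "high"},
]

def rollup(findings):
    """Count findings per application and health area."""
    summary = defaultdict(lambda: defaultdict(int))
    for f in findings:
        summary[f["application"]][f["health_area"]] += 1
    return {app: dict(areas) for app, areas in summary.items()}

print(rollup(findings))
# {'billing': {'security': 1, 'performance': 1}, 'crm': {'security': 1}}
```

The same roll-up, grouped by portfolio or business unit rather than by application, gives senior leadership the view it needs to direct investment.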

4. Give QA Team Authority to Reject Code

It is common knowledge that shifting left, that is, detecting and addressing issues early in the process, improves the efficiency of the development process. For an additional boost to the return on automated code testing, the QA team should have the authority to reject code that does not fall within the tolerance level set by the organization. Repeated rejections from QA will motivate the development team to improve the quality of their deliverables over time, whether by brushing up on coding best practices or by increasing their use of automated code analysis tools. This holds whether the QA teams are embedded in the development organization or centralized as a shared service.
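
One hypothetical way to give those rejections teeth is a gate in the delivery pipeline that fails the build when analysis results exceed the agreed tolerance, so that code outside the threshold never reaches QA's queue in the first place. The report format and threshold below are assumptions for the sketch, not any specific tool's interface.

```python
#!/usr/bin/env python3
# Sketch of a pre-QA quality gate: exit non-zero when the analysis report
# exceeds the agreed tolerance so the pipeline can reject the build.
import json
import sys

MAX_DEFECTS_PER_KLOC = 2.0  # example tolerance set by the organization

def main(report_path: str) -> int:
    with open(report_path) as fh:
        report = json.load(fh)  # e.g. {"defects": 180, "lines_of_code": 75000}
    density = report["defects"] / (report["lines_of_code"] / 1000)
    if density > MAX_DEFECTS_PER_KLOC:
        print(f"Rejected: {density:.2f} defects/KLOC exceeds {MAX_DEFECTS_PER_KLOC}")
        return 1
    print(f"Accepted: {density:.2f} defects/KLOC is within tolerance")
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv[1]))
```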

5. Stand Behind a Standard Process

Do not underestimate the weight your words carry as a leader. Simply providing a technology tool is not enough. Be clear on how it is to be implemented and who is accountable for the results. Further, be precise about how you expect it to improve efficiency in your organization, setting incremental targets over a period of one to three years. Encourage adherence to the process, regularly report trends against the improvement targets you have set, and remind your teams that you are reviewing these numbers regularly (so be prepared to talk about them at all times!).

By reducing the amount of waste in your software development pipeline, you can increase deliverable quality, deliver products faster, and energize your organization. Automated code testing, if implemented well, will minimize waste in the process by shifting the ownership of quality to the left and boosting QA's confidence in their own testing cycle. Companies that rigorously implement automated code testing have seen maintenance costs cut by as much as 50 percent and incident rates fall by up to a third. This is the path to better quality and faster releases.
