Executives' Secret Weapon: Continuous Testing

Alex Martins, CTO/Advisor - Continuous Quality, CA Technologies [NASDAQ:CA]

I work with executives across different industries, and they’re all focused on defining new business strategies to navigate today’s highly competitive and complex market. All of them recognize that technology is at the core of those strategies, as the enabler of the digital transformation initiatives that support executing them.

These initiatives require existing software applications to be changed and new ones to be created from scratch. At this point, executives have two main concerns:

1. Business operations can’t be disrupted

2. Any application code being deployed to production systems, whether a change to an existing application or a brand-new one, has to be of the highest quality

A third concern that has become more prominent in the past couple of years is speed, measured as time to value: the time from having an idea until the customer receives the intended value or benefit.

Most of these executives have spent their entire budget trying, unsuccessfully, to tackle concerns #1 and #2 above. Addressing those two items is hard enough. Doing it with speed is exponentially harder with traditional technology, processes, and skills.

Agile and DevOps to the Rescue

That’s where agile development methodologies and DevOps help organizations accelerate the path from ideation to application development and streamline the handoffs across the teams involved in the SDLC value stream. Continuous Delivery has brought the acceleration necessary to tackle concern #3 above.

As I work with different organizations, I see them slowly becoming able to accelerate code deployment without causing any outages to the business, even the more traditional ones still in the Waterfall era. But executives are also noticing an interesting side effect: the amount of rework has gone up considerably.

Yes, organizations are starting to bring an idea or change to life in the form of code very quickly. And that code is starting to flow across the different environments (e.g. Dev, QA, Stage, Prod) seamlessly and without manual intervention. However, once the code hits production, the business and customers are not always realizing the expected benefits.

When I look at modern software development approaches, I like to think of quality in terms of the 4 areas below.

In traditional organizations, executives are seeing good-quality code being deployed quickly, yet application quality is still low. If the stakeholder (business or customer) does not realize the value, then overall application quality is poor. That’s how executives are looking at it.

Where’s the disconnect?

Further analysis by the teams has shown that most of this lack of alignment comes from poorly defined requirements (e.g. user stories that are too high-level, with weak acceptance criteria) and from code reaching production without being tested thoroughly enough. The latter happens because most testing is still performed manually by development, testing, and business analyst teams.

In other words, the requirements intake process and the code pipeline orchestration have been improving as techniques and technology have evolved, but testing across the SDLC continues to be performed in the traditional way: heavily manual, with automation focused on regression tests only. Even that automation requires a lot of effort to maintain as every code iteration brings changes, so automation is always lagging behind despite an army of dedicated automation engineers.

So if you’re reading this and the scenario resonates with your organization’s current state, there is no need to panic. This is normal and can be fixed. It is simply part of evolving your testing techniques and tooling so that the requirements intake and code pipeline orchestration processes are effective: i.e. they deliver quality code to production at the speed the business and customers expect.

Evolving your Quality Processes: How to Start

Start with value stream mapping. Work with your organization’s leaders (e.g. Business Analysis, Development, QA and Operations) to ensure they thoroughly understand what it actually takes to turn an idea into code, test it, and deploy it to production. As your leaders do that, they will likely find long waiting times in each of these disciplines, most notably in software testing.
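
To make that waiting time visible, it helps to pull stage timestamps out of your work-tracking tool and look at the gaps between stages. Here is a minimal sketch of that calculation in Python; the stage names and dates are entirely hypothetical.

    from datetime import datetime

    # Hypothetical timestamps for one work item moving through the value stream;
    # in practice these would come from your work-tracking / ALM tool.
    stage_events = [
        ("Idea approved",     datetime(2018, 3, 1, 9, 0)),
        ("Development start", datetime(2018, 3, 8, 9, 0)),
        ("Development done",  datetime(2018, 3, 12, 17, 0)),
        ("Testing start",     datetime(2018, 3, 20, 9, 0)),
        ("Testing done",      datetime(2018, 3, 23, 17, 0)),
        ("Deployed to prod",  datetime(2018, 3, 30, 9, 0)),
    ]

    # Print the elapsed time between consecutive events. The gaps between a
    # "done" event and the next "start" event are pure waiting time, the
    # bottlenecks a value stream map is meant to expose.
    for (prev_name, prev_time), (next_name, next_time) in zip(stage_events, stage_events[1:]):
        days = (next_time - prev_time).total_seconds() / 86400
        print(f"{prev_name} -> {next_name}: {days:.1f} days")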

The next step is to ensure that the inputs testing needs (i.e. requirements; note that a user story is a form of requirement) are known and understood by developers and testers. They don’t need to be documented in written form. In fact, it’s more beneficial if they’re not. Written language is inherently ambiguous, and we all know ambiguity leads to defects. The figure below is a good illustration of how ambiguous requirements pose a threat to quality.

It is very common for team members to draw flowcharts or activity diagrams on a whiteboard or a piece of paper to explain a piece of functionality or a feature. As long as all team members start the iteration with the same understanding, and they leverage technology to take that diagram and automatically generate the optimal set of test cases for maximum coverage, developers will be able to write their code to pass those tests (think ATDD, acceptance test driven development) and testers will focus on using more modern technology to bring automation earlier in the cycle, inside the iteration or sprint, rather than thinking of automation only in terms of regression tests.
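
To make that concrete, here is a minimal sketch of the diagram-to-tests idea in Python, assuming the whiteboard flowchart can be expressed as a simple directed graph. The flow, node names, and branch labels are invented for illustration; model-based testing tools do the equivalent with far more sophisticated path and coverage optimization.

    # The activity diagram as a directed graph: each node maps to a list of
    # (branch label, next node) pairs. Terminal nodes have no outgoing branches.
    flow = {
        "enter order":     [("valid input", "check stock"), ("invalid input", "show error")],
        "check stock":     [("in stock", "take payment"), ("out of stock", "offer backorder")],
        "take payment":    [("approved", "confirm order"), ("declined", "show error")],
        "offer backorder": [("accepted", "confirm order"), ("rejected", "show error")],
        "show error":      [],
        "confirm order":   [],
    }

    def test_paths(node, path=()):
        """Enumerate every start-to-end path; each path is one generated test case."""
        path = path + (node,)
        if not flow[node]:                      # terminal node: one complete scenario
            yield path
            return
        for branch, nxt in flow[node]:
            yield from test_paths(nxt, path + (f"[{branch}]",))

    for i, case in enumerate(test_paths("enter order"), 1):
        print(f"Test case {i}: " + " -> ".join(case))

Each generated path is one test case, and together the paths exercise every branch in the diagram, giving developers and testers the same executable understanding of the feature at the start of the iteration.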

If you think about what this picture means, developers will write code correctly the first time, which leads to less rework due to defects. Testers will focus on in-sprint automation, which will reduce the testing cycle time. The overall quality of the code that gets deployed to production will be higher, and so will the quality of the application itself, because the business and customers will realize the value sooner with a quality deployment.

Continuous Testing Enabling Digital Transformation Success

Many of the executives I work with are starting to see their teams achieve the desired acceleration and quality across legacy and new applications. By shifting testing activities to the left of the SDLC and modernizing test automation techniques and tools, teams are able to continuously test the application code at every code check-in, at the unit, system, system integration, and performance test levels.
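
In practice this usually takes the shape of a quality gate that runs each test level in order on every check-in and stops at the first failure, so feedback reaches the developer immediately. Here is a rough sketch of that gating logic in Python; the stage commands and paths are placeholders rather than real tools, and a real pipeline would invoke your own test runners from a CI server.

    import sys

    def run_stage(name, command):
        """Placeholder for invoking a test suite; return True if it passed."""
        print(f"Running {name}: {command}")
        return True  # stubbed result; replace with a real runner invocation

    # One entry per test level, executed in order on every check-in.
    STAGES = [
        ("unit tests",               "pytest tests/unit"),
        ("system tests",             "pytest tests/system"),
        ("system integration tests", "pytest tests/integration"),
        ("performance tests",        "run-load-test --profile smoke"),  # hypothetical CLI
    ]

    for name, command in STAGES:
        if not run_stage(name, command):
            print(f"Check-in rejected: {name} failed")
            sys.exit(1)

    print("All test levels passed; build can be promoted to the next environment")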

Continuous Testing is becoming a secret weapon for executives looking to truly accelerate time to value. Any digital transformation strategy that focuses solely on accelerating code development and deployment through modern techniques such as TDD and pipeline orchestration will quickly discover that customers are not perceiving the business value. Software testing is key to ensuring not only that quality code is built, but also that a quality application is deployed into the hands of your customers. Big difference!