Hey, fellow Leader 🚀,
I am Artur and welcome to my weekly newsletter. I am focusing on topics like Project Management, Innovation, Leadership, and a bit of Entrepreneurship. I am always open to suggestions for new topics. Feel free to reach me on Substack and share my newsletter if it helps you in any way.
This article continues the discussion on managing costs related to automated testing. The first article covered initial needs such as training and defining a testing strategy. Once the data is on our side, we can start planning and putting into motion the strategy defined in the previous steps. The link to the first article is shared below.
Budget and maintenance
One way to provide visibility on testing costs is to make sure the team follows a structured path: identify the areas of improvement, then estimate the effort to tackle them. Otherwise, what typically happens is an ongoing, uncontrolled effort to maintain and build new automated tests, diluting the effort spent on the development or acceptance testing phases.
In software development, there is a natural curve of complexity that keeps rising every time new features are added, which in turn increases the cost of implementing new changes. Without managing the costs related to tests, it eventually becomes difficult to distinguish which costs belong to development, configuration management, or test maintenance. The best approach is to work with the team to make a clear distinction between these tasks and to report time with that distinction in mind. If we want to be a bit ironic, developers love adding details to their timesheets.
On new releases, there will always be a cost for maintaining existing tests and building new ones. I have worked with teams where an estimated 10% of the development effort was dedicated to automated tests alone. It is important to define a threshold to keep these costs under control, while paying close attention to the team's feedback in case they report that the allocated effort is not enough.
When the cost of testing goes beyond that threshold, the team should step back and revisit their testing strategy. The learning process should be continuous and naturally accepted. Over time, code becomes obsolete and features are discontinued; dead code needs to be removed, and the automated tests attached to it only add to the overall maintenance costs. It is important to delete tests that are no longer used or needed, to keep the test suite lean.
Testing and DevOps
A DevOps strategy is essential to make sure these tests are executed on every change. I am a big fan of DevOps metrics in conjunction with code reviews. Code reviews are a great way for the team to share knowledge and build shared responsibility among peers.
The integration with DevOps guarantees continuous use of the team's defined test strategy and helps catch regressions and other bugs before the code reaches production. However, for this to work, the way the tests are designed and built is of paramount importance. If a test only checks that a field is greater than 0 and accepts only numbers, it won't provide any significant safety net before the code arrives in production. If, instead, the test suites make sense and are based on real-world use cases, they will potentially save a lot of headaches in the future.
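To make the contrast concrete, here is a small sketch in Python. The `apply_discount` function and the scenario are invented for illustration only:

```python
# Contrast sketch: a trivial check vs. a scenario-based test.
# `apply_discount` is a hypothetical function, not from the article.

def apply_discount(price: float, percent: float) -> float:
    """Apply a percentage discount, rejecting invalid percentages."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Weak test: only checks the result is a positive number.
# It passes even if the discount math is completely wrong.
def test_discount_is_positive():
    assert apply_discount(100, 10) > 0

# Scenario test based on a real-world use case: a 10% loyalty discount
# on a 19.99 order must yield 17.99, and invalid input must be rejected.
def test_loyalty_discount_scenario():
    assert apply_discount(19.99, 10) == 17.99
    try:
        apply_discount(19.99, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

Both tests would show up green in a pipeline; only the second one is a safety net.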
DevOps is not only a platform but a mindset: it is about giving developers the tools and the space for continuous learning and continuous improvement. Deploying new features seamlessly into production environments requires well-designed tests, built on feedback from the different product stakeholders.

However, if the tests are not maintained properly, they can provide a false sense of security and let errors slip into production environments. Spending a few more days improving a system might be more cost-effective than delivering a broken piece of code that can hurt both the product and the company's reputation. Just look at what happened to CrowdStrike: their delivery system wasn't designed to safely roll out a change in such a critical area. I am sharing the article below for your curiosity.
The problem with even mature, highly capable testing systems is that, despite everyone's effort, they may still overlook a defect and deliver a broken feature.
Mindset
The principle in any Agile team using any kind of DevOps strategy is to implement feedback loops in the team's process, meaning the mindset should be one of continuous learning and continuous improvement. There is also no added value in pursuing a code coverage target if the tests used to fulfil that objective keep breaking on every change, and no meaningful test exists that can detect broken features or newly introduced bugs early. The objective of automated tests is to avoid bugs, not to get a score on some dashboard.
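A quick sketch of why a coverage score alone proves little. Both tests below execute every line of `total_price` (the function is hypothetical), so a coverage dashboard scores them identically, but only one would ever catch a bug:

```python
# Coverage without assertions: both tests execute every line of
# `total_price`, but the first would pass even if the math were wrong.

def total_price(items):
    """Sum price * quantity over (price, qty) pairs."""
    return sum(price * qty for price, qty in items)

def test_runs_but_checks_nothing():
    # Inflates the coverage number; asserts nothing.
    total_price([(2.0, 3)])

def test_actually_verifies_behaviour():
    # Pins the expected result, so a regression fails the build.
    assert total_price([(2.0, 3), (1.5, 2)]) == 9.0
```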
When building these tests, developers should take note whenever a flaw is detected late in the software that should have been caught by the automated tests. Well-designed systems need a learning cycle in which developers enrich the test suites based on previous incidents or bugs, preventing the same situation from occurring again.
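One common way to close that learning cycle is a regression test named after the incident it prevents, so the suite documents its own history. The bug number, function, and behaviour below are all invented for the sketch:

```python
# Sketch of a regression test pinned to a past incident. The bug id,
# `parse_quantity`, and the failure scenario are hypothetical.

def parse_quantity(raw: str) -> int:
    """Parse a quantity field; whitespace-only input counts as zero."""
    raw = raw.strip()
    return int(raw) if raw else 0

# BUG-1234 (hypothetical): a whitespace-only quantity crashed checkout
# in production. This test pins the fix so the same failure cannot
# silently reappear in a future change.
def test_bug_1234_whitespace_quantity_is_zero():
    assert parse_quantity("   ") == 0
    assert parse_quantity(" 3 ") == 3
```

A reviewer seeing this test immediately knows why it exists and why it must not be deleted when the suite is trimmed.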
However, automated tests should not replace the Quality Assurance phase. They can help prevent serious issues, but they are not bulletproof, and a good automated system is no substitute for tests performed by humans with critical thinking. This means continuous investment in automated testing will only decrease the project's overall QA costs if it makes that phase faster, and even then there is no guarantee it will be cheaper overall.
That’s it. If you find this post useful please share it with your friends or colleagues who might be interested in this topic. If you would like to see a different angle, suggest in the comments or send me a message on Substack.
Cheers,
Artur