Twenty-five years ago, I went from working for one startup to what was, for all intents and purposes, another startup. The new "startup" CEO had purchased a software company from a major hardware manufacturer. His goal was to revitalize the product. And for the next twenty years, I helped him do that twice.
When I joined the company, there were three types of employees: accounting, sales, and a handful of developers. Product development was not organized, there was no source code control, everyone worked on their own thing, and deployment was a copy-paste situation. While I'm sure the other developers did a bit of testing, most testing was probably done in production (by the customers). It took a year or two to talk the company into hiring a QA person.
Why did it take so long? Perceived cost. Analyses and studies have shown it's cheaper to find and resolve bugs before release than after. Involving QA early in the product lifecycle costs more up front but is supposed to pay off in the long run, and I'm sure it does once you factor in the risk of losing customers over defects that reach production.
So, why not just trust developers to test their code? Developers absolutely should test their code. A certain blindness sets in during development, though, causing developers to miss things that QA spots easily. Developers should find most of their bugs before sending code to QA; quality assurance is a backstop that ensures a superior user experience.
Developers can still reduce the cost of defects and maintenance. Automated testing improves both the quality of their work and the quality of the products they build, and developers can write testable code and unit tests by following best practices such as inversion of control, test isolation, and test-driven development (TDD).
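As a minimal sketch of what that looks like in practice (the class names and the payment-gateway scenario here are hypothetical, just for illustration), a class that receives its dependency through its constructor can be tested in isolation with a stand-in object:

```python
import unittest
from unittest.mock import Mock

class CheckoutService:
    """Accepts its payment gateway via the constructor (inversion of control),
    so tests can substitute a fake instead of calling a real service."""

    def __init__(self, gateway):
        self._gateway = gateway

    def charge(self, amount_cents):
        if amount_cents <= 0:
            raise ValueError("amount must be positive")
        return self._gateway.charge(amount_cents)

class CheckoutServiceTest(unittest.TestCase):
    def test_charge_delegates_to_gateway(self):
        # The fake gateway isolates the test from any real payment system.
        fake_gateway = Mock()
        fake_gateway.charge.return_value = "receipt-123"
        service = CheckoutService(fake_gateway)

        self.assertEqual(service.charge(500), "receipt-123")
        fake_gateway.charge.assert_called_once_with(500)

    def test_rejects_non_positive_amounts(self):
        service = CheckoutService(Mock())
        with self.assertRaises(ValueError):
            service.charge(0)

if __name__ == "__main__":
    unittest.main()
```

Because the dependency is injected rather than created inside the class, the test runs fast, deterministically, and without touching anything outside the process.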
With test-driven development, developers start with the software requirements, write tests that express those requirements, and only then write the code that makes the tests pass. Writing tests first, before writing implementation code, does the following (a short sketch follows the list):
- Forces developers to write only the code necessary to pass the tests, implementing the requirements without scope creep.
- Encourages developers to organize code properly, making maintenance easier.
- Produces high test coverage and, with it, higher-quality solutions.
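To make the cycle concrete, here is a minimal, hypothetical example: the requirement is "round prices to the nearest cent." The test is written first and fails until the small function below it is added; once the test passes, there is no reason to write more code.

```python
import unittest

# Step 1: write the test first. It fails until round_price exists and behaves correctly.
class RoundPriceTest(unittest.TestCase):
    def test_rounds_half_up_to_nearest_cent(self):
        self.assertEqual(round_price(19.995), 20.00)
        self.assertEqual(round_price(3.14159), 3.14)

# Step 2: write only enough code to make the test pass.
def round_price(value):
    from decimal import Decimal, ROUND_HALF_UP
    return float(Decimal(str(value)).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP))

if __name__ == "__main__":
    unittest.main()
```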
Do you still need QA if you use TDD? Absolutely. QA is required for end-to-end integration testing. In regulated environments like medical device development, QA is integral to the verification and validation (V&V) process, so you can't skimp on it. A test-driven development methodology, coupled with a good QA team, ensures delivery of the highest-quality product.
In summary:
- TDD is not the right fit for every software solution.
- TDD is a good fit for most software solutions.
- Add testing and QA as early in your development process as possible to avoid costly defects and poor customer reception.
My team at Code Scientists and I have developed software products commercially for decades. We can build anything from enterprise systems to mobile applications to firmware for your hardware solutions.
Got something you want to build? Let's build it together.