Test Code
When I first started writing tests, I felt deeply unproductive doing it. I would rush through each test as fast as possible so I could move on to the "real" code.
Then one day a production deployment failed because of a missing, simple test, and I realized the critical value of tests: good tests are in fact "real" code, and they deliver immense value. I began to see tests as "support beams" for the application that keep it from collapsing.

What are good tests? Good tests fail only when something is really wrong, like the logic being altered unintentionally by some change. Good tests don't depend on data that might change, and they happen when you have complete control of all the inputs. If a test fails when nothing is wrong, make fixing it your top priority, or delete it. No test > not working test. Unreliable tests will make you stop trusting your test results, and the whole suite loses value. You need good, dependable tests to be the support beams of your application.
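As an illustration of "complete control of all the inputs", here is a minimal sketch in Python (the `is_expired` function and its `now` parameter are hypothetical, invented for this example): instead of reading the clock inside the function, the current time is passed in, so the test fully controls it and can never fail because of when it happens to run.

```python
from datetime import datetime, timedelta

def is_expired(expiry: datetime, now: datetime) -> bool:
    # The current time is an explicit input, not a hidden dependency,
    # so the test controls it completely and stays deterministic.
    return now >= expiry

def test_is_expired():
    now = datetime(2024, 1, 1, 12, 0)  # fixed, fully controlled input
    assert is_expired(expiry=now - timedelta(days=1), now=now)
    assert not is_expired(expiry=now + timedelta(days=1), now=now)

test_is_expired()
```

The same idea applies to any changing dependency: random numbers, database rows, remote services. Pass them in, and the test stops depending on data that might change.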
When you delete a test that is not working, make sure some other test still covers the code. You can verify this with a code coverage tool, or with what I call the poor man's code coverage: break the code and see which tests fail. This technique is useful even when you have a coverage tool. A line may show as covered, yet the test may not be checking every possible outcome; by breaking the code you can quickly verify whether a test fails appropriately.

Another barrier that made me not want to write tests was an unstable environment. I was using a BDD tool that had a habit of spontaneously not working. I would go to write a new test and end up spending an hour trying to get the tool to function, feeling incredibly unproductive. A flaky framework is not worth whatever benefits it provides, and I would recommend ditching it if you can. If you are forced to use it, document the time you spend getting the framework running again and let your manager know the cost of using the tool.
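To make the "covered but not really tested" gap concrete, here is a small sketch (the `classify` function and both tests are invented for the example). A coverage tool would report every line of `classify` as covered by the weak test alone, but breaking the negative branch would not make it fail:

```python
def classify(n: int) -> str:
    # Hypothetical function with two distinct outcomes.
    if n < 0:
        return "negative"
    return "non-negative"

def weak_test():
    # Executes the function, so coverage shows green -- but it only
    # ever asserts one of the two outcomes.
    assert classify(5) == "non-negative"

def strong_test():
    # Poor man's coverage check: break the negative branch on purpose
    # (say, return "non-negative" there too). weak_test still passes;
    # this test fails, proving it actually exercises that outcome.
    assert classify(-3) == "negative"

weak_test()
strong_test()
```

The break-it-and-watch step is manual, but it answers a question no coverage report can: does a test actually depend on this behavior?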
If your test framework is slow, replace it. Speed is critical: if your tests are slow, you will not run them as often as you should. In .NET I have found NUnit and xUnit to be far faster than MSTest, especially with parameterized tests.
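The parameterized-test idea is not .NET-specific; as a sketch of the same pattern using only Python's standard library (the `add` function and the cases are made up for the example), each input tuple runs as its own sub-test, so one failure pinpoints the exact case without hiding the others:

```python
import unittest

def add(a: int, b: int) -> int:
    return a + b

class AddTests(unittest.TestCase):
    def test_add(self):
        # Parameterized style: one test body, many cases. Each tuple
        # runs as a subTest so failures are reported individually.
        for a, b, expected in [(1, 2, 3), (0, 0, 0), (-1, 1, 0)]:
            with self.subTest(a=a, b=b):
                self.assertEqual(add(a, b), expected)

# Run the suite programmatically (avoids unittest.main's sys.exit).
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(AddTests)
)
```

In NUnit the equivalent is the `[TestCase]` attribute; in xUnit, `[Theory]` with `[InlineData]`.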
The final thing that made me not want to write tests was legacy code that had no tests and thus wasn't written to support unit testing. A lot of the time it felt as if the code had been written purposely to make unit tests as hard as possible to write. This was tricky until I started using a "shoring" technique. In construction, "shoring" is a temporary support apparatus used while the long-term support beams are built in. In coding, the shoring is a hard-coded integration test that will likely fail in the future when the data or a service changes. It is deliberately temporary: it shores up the code while I safely refactor it into something that makes a proper test easy to write, and then I can write the easy test.
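A minimal sketch of the shoring idea, in Python (the `legacy_report` function, its orders, and the hard-coded expected string are all invented for the example): the test pins the code's current output verbatim, brittleness and all, purely so the refactoring has a safety net.

```python
def legacy_report(orders):
    # Stand-in for tangled legacy code we dare not change blindly.
    total = 0
    for o in orders:
        total += o["qty"] * o["price"]
    return "TOTAL: %.2f" % total

def test_shoring():
    # Shoring: pin the current output exactly, hard-coded values and all.
    # This test is temporary and brittle on purpose -- it exists only to
    # catch accidental behavior changes while the code is refactored
    # into a shape that supports real unit tests.
    orders = [{"qty": 2, "price": 3.5}, {"qty": 1, "price": 10.0}]
    assert legacy_report(orders) == "TOTAL: 17.00"

test_shoring()
```

Once the refactored code has proper, focused unit tests, the shoring test comes out, just like the temporary supports on a construction site.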