TDD and Accuracy vs. Precision


It is important that the system behaviors we create as developers are both accurate and precise. These are not the same. One can be:

  • Accurate, but not precise, such as “Pi is a little over three.”
  • Precise, but not accurate, such as “Pi is 5.393858303895.”

Accuracy and precision are related, though: the level of precision required tells you how far you must go in determining accuracy.
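The pi examples above can be made concrete: whether a value counts as "accurate" depends entirely on the tolerance you choose. A minimal sketch (the tolerance values here are illustrative, not from the original post):

```python
import math

def accurate_at(value: float, tolerance: float) -> bool:
    # A value is "accurate" only relative to a chosen precision:
    # it must fall within the tolerance of the true value.
    return math.isclose(value, math.pi, abs_tol=tolerance)

# "Pi is a little over three" -- accurate at coarse precision only.
print(accurate_at(3.15, 0.15))    # True
print(accurate_at(3.15, 0.001))   # False

# "Pi is 5.393858303895" -- many digits of precision, never accurate.
print(accurate_at(5.393858303895, 0.15))  # False
```

Note that the second value fails at every reasonable tolerance: precision in the answer does nothing to rescue a lack of accuracy.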

Here is an example. The system is required to trigger a process “immediately after midnight.”

“Immediately” can only be determined to be accurate if you know the level of precision in the requirement. Is it one second after? One tenth of a second? One hundredth? By “after,” do you really mean the behavior should be triggered at midnight, since computers do not behave instantaneously? Should you use the clock speed of the machine to determine what “immediately” means?
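This is exactly the kind of question a test forces out into the open. A sketch of what such a test might look like, assuming (purely for illustration) that the stakeholders agree "immediately" means within one second of midnight; the function name `should_trigger` and the tolerance are hypothetical:

```python
from datetime import datetime, time, timedelta

# Assumed answer to the precision question: "immediately after midnight"
# means within one second after midnight. The test cannot be written
# until someone commits to a number like this.
TOLERANCE = timedelta(seconds=1)

def should_trigger(now: datetime) -> bool:
    # Sketch implementation: fire if we are at or past midnight,
    # but no more than TOLERANCE past it.
    midnight = datetime.combine(now.date(), time.min)
    return timedelta(0) <= now - midnight <= TOLERANCE

def test_fires_half_a_second_after_midnight():
    assert should_trigger(datetime(2024, 1, 2, 0, 0, 0, 500_000))

def test_does_not_fire_two_seconds_after_midnight():
    assert not should_trigger(datetime(2024, 1, 2, 0, 0, 2))

test_fires_half_a_second_after_midnight()
test_does_not_fire_two_seconds_after_midnight()
```

The point is not this particular tolerance; it is that the test cannot be written at all until the question "how immediate is immediately?" has a definite answer.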

One big advantage of TDD is that automated tests impose rigor. They do not allow us to accept requirements without understanding them in terms of both accuracy and precision, because we cannot write the test code correctly without that information.

If developers see the unit tests as specifications and understand that they are not supposed to “make things up,” then when questions like these arise, they are forced to get more detail.
