A long time ago my friend’s mum’s toaster broke, so she sent it back to the manufacturer. They serviced it and returned it to her with a note saying it was all working, have a great day! She plugged it in and tried it out, but it was still broken and didn’t toast the bread.
She rang the company and got through to a support technician who said the toaster was working perfectly when they tested it. Confused, she pressed for more details about how they had tested the product. He went into great detail about the impressive battery of tests they had performed using all manner of high-tech analytical equipment. They’d measured so many voltage reference points, run diagnostic routines on the logic chips, and checked the inductive load on the transformer and so on. All tests passed 100% so they gave it a quick polish and popped it back in the post.
“But did you make any toast?”, she asked him.
Well of course not, he explained. This is a hi-tech, clean-room-grade manufacturing area – they can’t have food or crumbs lying around, that would just be crazy. And who’s going to go out and buy all the bread?
The Toast Test has become something of a legend in the various companies I have worked at over the years. I usually introduce the story, and it becomes a very simple term for what software engineers (myself included) sometimes forget to do after years of adopting the best practices of module testing, unit testing, TDD and so forth. At the end of a long, hard project, once you’ve got 100% code coverage and all the tests are green, it’s so easy to forget to return to the original brief. Even terms like integration testing or system testing, although technically appropriate, don’t quite capture the simple, singular goal of the Toast Test.
Just: try using the product to do the one main thing it’s supposed to do – like a user would.
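In code terms, the Toast Test is the end-to-end check that the internal diagnostics skip. As a rough sketch (the `Toaster` class and its methods here are hypothetical stand-ins, not any real API), the unit-level checks can all pass while the one thing a user actually does still fails:

```python
class Toaster:
    """Hypothetical product under test."""

    def __init__(self):
        self.voltage_ok = True      # internal diagnostics all green
        self.logic_chips_ok = True
        self.heating_element_connected = False  # the actual fault

    def self_diagnostics(self):
        # What the service centre checked: internals only.
        return self.voltage_ok and self.logic_chips_ok

    def toast(self, bread):
        # What the user actually does.
        if not self.heating_element_connected:
            return bread  # comes out exactly as it went in
        return "toasted " + bread


toaster = Toaster()

# Unit-level checks pass with flying colours...
assert toaster.self_diagnostics()

# ...but the Toast Test catches the real fault.
result = toaster.toast("bread")
if result == "toasted bread":
    print("toast test passed")
else:
    print("toast test FAILED: no toast was made")
```

The point isn’t the toy class; it’s that the final assertion exercises the product the way a user would, rather than inspecting its parts.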