Jennifer Bedell recently wrote at PMStudent about testers gold-plating projects. Definitely an interesting read, and there's a pretty hot discussion under the post too.
I’d say the post is written in defense of scope. Jennifer argues that testers should test only against the specification, because once they step outside the specs it becomes gold-plating. In other words, quality assurance should verify only whether a product is consistent with the requirements.
That’s utterly wrong.
It might work if we worked with complete and definite specifications. Unfortunately we don’t. And I mean we don’t ever work with one of those. They just don’t exist. What we work with is incomplete, vague and ambiguous.
And yes, you can deliver software which sticks perfectly to the specs and at the same time is an unusable piece of crap.
You should actually encourage testers to get out of the box (the specifications) and wander freely through the application, trying every weird scenario they can think of. Why? Because your beloved users won’t follow the specs. They won’t read them. They won’t even care that something by that name exists. A spec is an artificial artifact created to make the discussion between product managers and developers easier. Users couldn’t care less about specs.
This means they don’t use the application the way it was specified in requirements, user stories or whatever you happen to create. They use the app in all kinds of ways, even the weirdest ones. You don’t want to face that unprepared. So forget about the damn specs when testing and try everything you can think of (and some things you can’t).
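To make that a bit more concrete, here is a minimal sketch of one way to automate this kind of spec-free poking around: property-based testing. It assumes a Python codebase with pytest and the hypothesis library, and normalize_username is a purely hypothetical function; the point is that the generator throws inputs at the code that no requirements document ever lists.

```python
# Minimal sketch of spec-free, property-based testing.
# Assumes Python with pytest and the hypothesis library installed;
# normalize_username is a hypothetical function used for illustration.
from hypothesis import given, strategies as st


def normalize_username(raw: str) -> str:
    # Imaginary function under test. The spec only says:
    # "trim whitespace and lowercase the name".
    return raw.strip().lower()


# hypothesis generates arbitrary text: empty strings, emoji, control
# characters, right-to-left scripts -- inputs no spec ever mentions.
@given(st.text())
def test_normalize_username_handles_anything(raw):
    result = normalize_username(raw)
    # Properties users silently rely on, even though the spec never states them:
    assert result == normalize_username(result)  # normalizing twice changes nothing
    assert result == result.strip()              # no stray whitespace survives
```

Run it with pytest; when hypothesis finds an input that breaks one of these properties, it shrinks it down to a minimal failing example, which is exactly the kind of "weird scenario" a spec-driven test plan would never contain.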
I’ll go even further: forget about test cases too, if you happen to have them. OK, forget about them during the first iteration of testing, because you do run more than one, don’t you? The reason is exactly the same as above.
And yes, I can imagine some extreme bugs being submitted because of this approach. Bugs which would take ages to fix and are so unlikely to appear in real life that it would be plain stupid to fix them. This only means you should monitor the incoming stream of bugs, hand-pick the few you aren’t going to fix and throw them out.
This isn’t a reason to consciously decrease quality by sticking to specs during testing.