Thursday 24 May 2018

Discussing Testers' Value, Beyond the Echo Chamber

I've finally come to a conclusion. I spend a lot of time thinking about why so much of the advice given to testers feels like common sense, and why my own thoughts don't feel novel or unique. Here's the problem: the testing community is, in some senses, an echo chamber. The same thoughts get shared within the community, and people hear, agree, and repeat. What I've personally neglected to do is reach outside the tight-knit testing community to listen, learn, and share. What do people in other engineering roles think the value of testers is? Or maybe they think there's no value, and I should seek to understand why. Sharing my thoughts on the value of the tester role, and engaging in discussion when opposition is raised, is only going to bring clarity to my points.

Here are a couple of things I've noticed in discussions with people in non-testing roles who think testing isn't valuable:


"Testers don't understand code. They rarely have technical understanding of how software works."
Admittedly, this one pisses me off more than most. Understanding how software works and understanding code don't go hand in hand. And being able to understand what a snippet of code is doing doesn't necessarily require knowing how to write code in that specific language either (there's a small example of what I mean below). Having said that, I've seen many organizations add fuel to this fire by hiring large numbers of people into testing roles who truly do not understand software. Like, at all. We're talking applications-as-a-black-box, tell-me-what-to-click-to-test mentality. Check the box, move on. I've got plenty of tips and tricks I'll share with you (in person, over social media, or in a later blog post) if you are a software tester who is afraid of code, or who doesn't know how to grow beyond the role of a pure black-box tester. Having a black-box hat on your rack is a great skill...but don't let it be your only skill.
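To illustrate, here's a tiny made-up Python snippet (mine, not from any real product). A tester who doesn't write Python for a living can still follow what it does and immediately come up with questions worth asking:

    from dataclasses import dataclass

    @dataclass
    class Customer:
        years_active: int

    def apply_discount(price: float, customer: Customer) -> float:
        # Loyal customers (5+ years) get 10% off; everyone else pays full price.
        if customer.years_active >= 5:
            return price * 0.90
        return price

    # Questions a tester can raise just from reading this, no Python fluency required:
    # Is "5 years" meant to be inclusive? What should happen for a negative price?
    print(apply_discount(100.0, Customer(years_active=5)))  # 90.0 -- a boundary worth confirming

Reading code and writing code are related skills, but you don't need the second one to get real value from the first.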


"Testing is slow. We should automate it all so it's fast and we won't need human testers."
Well, we're actually in partial agreement here. We should automate regression coverage so that devs can be sure their feature continues to work as expected going forward, without relying on testers having to revisit the feature and every valuable scenario after each change. That would be slow, tedious, and not a good use of anyone's time. But if we can automate all of that away, it's a good use of skilled testers' time to explore the feature: exercise workflows as a user would, look for edge conditions and risky areas, think about the feel of the feature, consider performance, accessibility, security... (I've sketched what I mean by automated regression coverage below.)
Good testers don't want to do the boring, repetitive shit. So let's not make them.
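Here's a minimal sketch of the kind of check I mean. Everything in it is hypothetical: I'm assuming a checkout module with a calculate_total(items, tax_rate) function, purely to show the shape of a regression test that runs automatically on every change:

    # test_checkout.py -- run by pytest in CI on every commit
    import pytest

    from checkout import calculate_total  # hypothetical module under test

    def test_total_includes_tax():
        # Pins down behavior that already shipped: two $10 items with 10% tax.
        assert calculate_total(items=[10.00, 10.00], tax_rate=0.10) == pytest.approx(22.00)

    def test_empty_cart_is_free():
        assert calculate_total(items=[], tax_rate=0.10) == 0.00

Checks like these take seconds to run on every commit, and that's exactly what frees a skilled tester to spend their time exploring instead of re-clicking through the same happy path.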


"Testers are only needed to execute testing steps. Therefore they're a cheap safety net for organizations."
Have I ever mentioned how much I hate seeing testers referred to as "cheap"? I read a response to a LinkedIn post the other day where someone said they don't want their developers responsible for doing any testing, because "why pay a skilled developer $120/hr to test their feature when a tester could do it for less than half that"...or something to that effect. Wow.
This is akin to the whole "anyone can test" or "testing isn't a real skill" type of argument. You also hear a lot of people repeating things like "this is why companies like Google or Facebook don't have testers". (Newsflash, by the way: they DO have testers, despite publishing all kinds of things saying they don't. Is it just cool to say you're hip and modern and don't have testers?)
I think this argument is perpetuated by people who have only come into contact with bad testers, or who have been at organizations that foster poor testing practices. Testing requires skills in functional decomposition, risk assessment, and specific kinds of communication and technical writing. But we as testers need to prove that by fighting back against bad testing.
