Tales of Testing
Testing, Leadership, Software and more...

Vocabulary and Perceptions

31/10/2017


 
"I am software tester. My job is to break software." , said one student in my Exploratory Testing workshop. I asked him to elaborate and explain me his techniques to break the software. He was silent for moment and then said he did not know how exactly to answer that. 

I then asked about the last defect he had found and how he found it. He could explain that to my satisfaction. Then I asked whether the defect was already there or whether something he did had introduced it. The student realised where I was coming from and admitted that he did not break the software; he only helped uncover software that was already broken. When I asked him once again to explain his techniques for uncovering those already-broken points, he explained them to my satisfaction, this time without freezing up.

One of the important lessons I have learned from James Bach, and something I make sure to propagate in my discussions with testers, is that we (testers) need to be careful about the vocabulary we use to describe our work, because it makes a big difference. With this small change I have experienced a change in myself: in how I perceived things while using the established vocabulary versus after I started choosing my vocabulary carefully. That's not all. I have witnessed that when you help people realise the same thing in a constructive way, they make sure to help others realise it too. And this "chain reaction" of passing the gesture along is, in my opinion, an important part of contributing to the craft.

I came across this thought-provoking post written by Maaret Pyhäjärvi (an influential colleague in the testing community whom I respect and admire), and some of the arguments she makes made me ponder my own attempts and experiences.

In her post, Maaret says:

Instead of changing the vocabulary, I prefer changing people's perceptions. And the people who matter are not random people on twitter, but the ones I work with, create with, every office day.

I fully support the idea of helping to change people's perceptions. I have made those efforts and have seen them take effect. The approach is very much in line with Weinberg's idea of influence, and it has always been my first approach to changing something. However, in this particular case the results have, in my experience, been short-lived. I found people slipping back to their initial understanding, which was primarily shaped by the established vocabulary, and every once in a while I had to discuss the same thing with them again. Can I say my efforts were paying off? Not really.

What was and is the problem? 

I realised that the people whose perception I had helped change (mostly programmers and non-testers) went back to the old vocabulary because the other testers and stakeholders they worked with were totally unaware of what they were talking about and why. They found it frustrating to keep explaining their rationale for using different vocabulary, and eventually they gave up. I remember one programmer friend coming to me and saying that he felt silly and stupid because the tester he was talking to was completely clueless about what he meant. He finally asked: if testers themselves don't care what their vocabulary should mean, why should he? And he was right!

The problem is that testers who understand the trouble with the established vocabulary are very few compared to the entire lot of project stakeholders who use it. And testers who make an effort to help others change their perception are even fewer.

And this is why I personally see a problem with "living with" established norms that need revision. I think it is high time we strongly disapprove of what we do not believe in, because it badly affects all the efforts made by people who care. When we know about the problems with things and still decide to live with them, our awareness of those problems becomes pointless, or at best less effective.


It is not just about the people around us

The other day I watched a humorous video by AIB on mass technical recruitment, in which the recruiters pick two gardeners towards the end of the drive to fill their quota and say, "Let's put these two on manual testing. Who requires talent for that anyway?"

That was very hard for me to digest, but at the same time I could not blame the producers of the video, because our established vocabulary is not their problem. They just presented the widely held (and mistaken) perception that our established vocabulary invites.

If we as testers don't care enough to change something wrong just because it is established, we are letting others shape a wrong perception of our profession, and that is a silent killer. One of the leading testing tool companies recently tried to showcase "manual testing" as outdated, bad testing and proposed their tool supporting Exploratory Testing as the solution. A major part of our industry still treats testing = manual testing = bottleneck, and hence thinks of eliminating testing altogether. But in reality what they want to get rid of is bad testing, not manual testing or testing as such. If not us, who else is supposed to care about these problems and make an effort to solve them? And I don't know how better to stop this than by getting rid of the labels and classifications that add so much to the confusion.

In my opinion, we are responsible for how we let others shape their understanding of us. No matter how hard we work on this with the people around us, there will always be people beyond our reach who undo what we do. If we keep collecting the karma of living with the established vocabulary and do not make deliberate efforts to change it, it is most likely going to haunt us and the generations of testers after us (if we survive at all).

It is now up to us whether to keep collecting that karma or to cleanse it. Cleansing sounds reasonable to me, but I am still looking for more options. What if we do both?


Interviewing Programmers for Quality Mindset

29/10/2017


 
Lately I happened to have an interesting discussion with my colleague Dirk Meißner on whether programmers should have a reasonable understanding of testing. A lot has been said and written about how testers need strong technical skills so that they can contribute effectively and remain valuable. Sure, that's helpful, and I too insist that it's high time testers got over their traditional way of working (and thinking). What surprises me, however, is that there is not enough awareness of, or discussion about, programmers learning to understand testing to amplify their own effectiveness.

Does it matter? Why? 

It absolutely does; at least now, if it did not before. "Whole Team Testing" is the new cool (again), especially in DevOps contexts, and it has its own reasons to be that way.

Let me explain. Agile teams typically have one tester dedicated to testing and related activities in the team. This tester is usually busy testing (and often automating) stories for each sprint, with the primary focus on acceptance criteria. If the tester is a "cool kid", they go above and beyond and test things outside the acceptance criteria too. Cool! Let's park this thought here for a moment. Okay?

James Bach, in his interesting article "Seven Kinds of Testers", beautifully explains the key patterns in testing styles and how testers typically fit one pattern or another, or sometimes a combination of several (or check out the thought-provoking tweet series by Michael Bolton). In over eight years of hands-on testing experience, I have found myself to be of one kind (two at most) at a time, and by the time I wish to change my hat (or style) it is usually almost time to deploy the feature to production. A pity!

The point is, there is a limit to how versatile a tester can be in the limited time available for each story they test. Sure, it's not impossible, but I would say it's not very easy either, given the time constraints. Now imagine that we add programmers from the team as other kinds of testers (based on their skills, expertise and experience) working on the same story. Don't you think that would most likely add more coverage for that feature, without really spending additional time on it? Don't you think the testing wisdom of a programmer would help the tester and the team ship a quality product? I'm sure that now you do!

When I say programmers should contribute to testing, it does not mean they must test the software the way testers do. Even developing the required mindset is already a good start. Of course it would be great if they could test it, but I feel that if they could at least understand modern testing, it would greatly benefit project teams.

How exactly programmers can learn to test and get started, or how testers can help onboard them with testing, is another topic. It deserves a dedicated post (more on that later).

This post is about identifying, while you interview them, programmers who have a testing mindset or skills that can help them test better. I recommend watching for these skills/traits in an interview:

1. Quest for Context

The scholar John von Neumann once said, "There's no sense being exact about something if you don't even know what you're talking about." In a world that is growing increasingly dependent on highly complex, computer-based systems, the importance of defining what you want to make before making it -- that is, knowing what you're talking about -- cannot be stressed enough. (Exploring Requirements: Quality before Design by Weinberg & Gause). 

Developing software is not just about writing a program that does the job; it is more about building a product your customer would like to use. In the past I have come across programmers who were excellent coders but failed to care enough about the purpose of the programs they wrote. I rarely see programmers questioning the user story beyond the acceptance criteria and technical implementation details, if any (unfortunately, it's not very different for the majority of testers either).

If I were to hire a programmer, I would expect them to ask context-revealing questions. By that, I don't mean just questioning the business value of the user story; there are things beyond that which matter. What if we find out that another team has worked on a similar solution before? What if we could re-use components developed by other teams? What happens when a particular feature requires specific technology expertise and the team does not have it? What happens when the stakeholders' understanding of technical details differs from the engineering team's? What if implementing a solution requires tools the team does not have, or the required access levels for that matter?

Sure, one can eventually find these things out after starting work on the ticket, but what's the point in discovering them by accident, when it's already late?

Programming interviews typically include a coding challenge that is assessed for the candidate's technical skills, familiarity with known technological issues, understanding of good programming practices, problem-solving skills and so on, which are indeed important. However, I have yet to see a programmer being assessed on the kind of questions they asked before jumping into the coding challenge itself. See whether they question the very purpose of the challenge, see whether they question its business value, and check whether they ask about other elements of the Project Environment and Product Elements for that matter. And please check whether they ask questions about testing and Quality Criteria, if nothing else. Most of the programmers I have interviewed simply assumed that there would be a tester in the team to QA their code; they just had to write the code and throw it into the tester's bucket. You had better watch out for that kind if your team does not have a tester.

Just like testing, good software development should be treated as an intellectual activity. The better one understands it, the more ways one can contribute to product quality. And it all begins with asking questions.

2. Interactional Expertise

If you are unfamiliar with the idea of Interactional Expertise, I suggest you start by understanding it first; even better if you can read Tacit and Explicit Knowledge by Harry Collins. I personally found it very useful learning when my friend Iain McCowatt introduced me to it.

The reason for mentioning Interactional Expertise as a skill here is that I find it to be a very important skill when it comes to having technical discussions with non-technical people, or even when it comes to having meaningful technical discussions in a short period of time.

Bringing up a technical topic in planning meetings or grooming/estimation meetings is usually like opening Pandora's box. Over the years I have been part of deep discussions in meetings whose only conclusion was to carry them over to the next meeting or to schedule a separate one. And then, on top of that, special meetings were needed to explain those technical things to non-technical people. Does that not sound familiar?

I feel that spending so much time on deep discussions so often is unnecessary, and it can be significantly reduced if all of us (not just programmers or testers) learn the skill of explaining things briefly (and to the point) when needed, without losing the substance or compromising the impact an elaborate version would have. The same goes for explaining technical things to non-technical people. As techies, we can't expect the whole world to understand the language we speak (it would be nice if that happened, though), but we can make things simple by learning the art of explaining them to others in a language and context they understand better.

An added advantage comes when you onboard new team members. Regardless of the role they are hired for, a person's IE skills help them onboard much better, and the skill definitely supports better collaboration and communication. In fact, when testers and programmers both have great interactional expertise, sessions like pair testing or pair programming become super productive. Imagine what value it can add to Mob Programming and Mob Testing sessions. I have worked with programmers who were masters at explaining technical things to non-technical team members, as if they were putting a child to sleep with a story: short, sweet and yet satisfying. That's what I mean by Interactional Expertise.

The next time you interview a programmer, look for these things; it will help you. When I interview testers for this, I usually ask them to explain some technical concept in 50 words, for example, and then the same concept in 100 words. It helps me analyse how good (or bad) their Interactional Expertise is. Asking programmers to write a technical bug report or a user story can also be a helpful trick for evaluating their IE.
​
3. Understanding of Testability

Maybe I am wrong about this, but I honestly feel our industry still lacks the required seriousness (and awareness) when it comes to building testable products. It is not just programmers who are unaware of it; many testers are too.

The only times I hear the word "test" in programmer interviews is when candidates talk about their unit tests, TDD or, at most, automated tests. And that's a pity!

Building testable products is an important part of software development, and it is important that programmers understand how to bake in testability right from the beginning. Sure, skilled testers can certainly advocate for testability, but it won't hurt if programmers too understand what it means for them and how they can contribute to it.
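To make that a little more concrete, here is a minimal sketch of what "baking in" testability can look like. It is only an illustration in Python; the class and names are hypothetical and not taken from any real project or from James's heuristics. The idea: the dependency on real time is injectable (controllability) and the decision comes back as a plain value (observability), so a test can exercise the logic without waiting for real time to pass or poking at a UI.

```python
from datetime import datetime, timezone
from typing import Callable, Optional


class TrialChecker:
    """Decides whether a user's free trial has expired.

    Two small design choices make this easy to test:
    - the clock is injected, so a test can supply any "current time"
      instead of manipulating the real system clock (controllability)
    - the decision is returned as a plain boolean, so a test can assert
      on it directly without a database or UI (observability)
    """

    def __init__(self, clock: Optional[Callable[[], datetime]] = None):
        self._clock = clock or (lambda: datetime.now(timezone.utc))

    def is_trial_expired(self, trial_end: datetime) -> bool:
        return self._clock() > trial_end


# In a test, time can simply be frozen:
fixed_now = datetime(2017, 10, 29, tzinfo=timezone.utc)
checker = TrialChecker(clock=lambda: fixed_now)
assert checker.is_trial_expired(datetime(2017, 10, 1, tzinfo=timezone.utc))
assert not checker.is_trial_expired(datetime(2017, 11, 1, tzinfo=timezone.utc))
```

A candidate whose coding-challenge solution naturally separates dependencies like this, even in a toy exercise, is already telling you something about how testable their production code is likely to be.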

While interviewing a programmer, I suggest you pay special attention to whether their solution demonstrates at least a few aspects of Intrinsic Testability as explained in Heuristics of Software Testability by James Bach. If not, at least make an attempt to discuss other aspects of software testability (listed in the heuristics) with the candidate in general, and gauge their fit for your requirement.

For sure, the skills mentioned above are equally important for testers too, but since the "Whole Team Testing" idea is picking up, I wanted to make them explicit for traditional non-testers. The next time you interview a programmer, please try this and see if it helps.

If we need technical testers, we also need programmers who understand testing. And that is reasonable to ask for, isn't it? 
   
Oh, and by the way, I will be touching on some of these topics in my talk for the Online Testing Conference. Feel free to join if the topic interests you.

    Author

    A passionate & thinking tester. Trainer & student of the craft of testing. Father, Foodie and dog lover. Chief Editor and Co-founder of Tea-time with Testers  magazine. 
    Setting up this blog to share my experiences whilst doing what I do... 


© Copyright | Lalitkumar Bhamare | All rights Reserved.