Testing with Questions

Testing is about questioning the software and using the answers we receive to inform our own decisions and those of our stakeholders. Whether we write all our tests out in scripts before we start execution or we explore the software as we go, everything we do comes down to asking a question that we want answered. When I am testing something I am always looking to learn something I did not already know, and keeping the fact that my tests are questions at the forefront of my mind helps me make the most efficient use of my time while testing.

My definition of testing is “a set of questions used as a means of evaluating the capabilities, effectiveness, and adherence of software”.

I find, however, that when I speak with other testers about this they either look at me like I am going mad or say something like “oh, never thought of it that way”, which always surprises me somewhat. I find it difficult to see how people who spend their days thinking about, documenting, and executing tests do not make the association that each of their tests is asking a question.

I have also seen that where testers do not make this association they fall into some traps. One of the most common traps I see is repeatedly asking the same question over and over again. If a test is not telling you something new, then running it is not the best use of your time. When testers repeatedly ask the same question they learn nothing new about the software, and given that every testing phase is hamstrung by time, this means that areas of the system remain a mystery (normally until a production user gets hold of them).

Another thing I notice is testers devising tests without having considered the question they are asking. They lack an understanding of why they want the answer the test is going to give them. Treating your test as a question forces you to think about not only what you are going to do, but also why you want to know, and what you are going to do with the information you get from it.

There are three main categories of question that I ask when testing:

  1. Verifying questions
  2. Investigative questions
  3. Clarifying questions

Verifying questions are those where I am looking to prove or disprove the truth of a known expectation. These are primarily the questions that relate to requirements testing: you have a requirement stating what the software must or must not be able to do, and the tests you run look to specifically answer those questions.
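As a sketch of what a verifying question might look like in code (the `apply_discount` function and the “never below zero” requirement here are hypothetical examples invented for illustration, not from any particular project):

```python
# Hypothetical requirement: "a discounted price must never be negative".
# apply_discount is an illustrative function, not a real API.

def apply_discount(price: float, discount_pct: float) -> float:
    """Apply a percentage discount, clamping the result at zero."""
    return max(price - price * discount_pct / 100, 0.0)

# Verifying question: "Is it true that the price never drops below zero,
# even when the discount exceeds 100%?"
def test_discount_never_negative():
    assert apply_discount(10.0, 150) == 0.0  # over-discount is clamped
    assert apply_discount(10.0, 50) == 5.0   # a normal discount still works

test_discount_never_negative()
```

The test passes or fails: the question has a known expected answer, which is what makes it verifying rather than investigative.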

Investigative questions are those where I am examining the software in an attempt to learn something hidden or complex. This is what I see as exploratory testing: I don’t have a specific requirement that I am looking to verify, but rather a hunch, or I am curious to know what will happen under a specific condition. I find that an investigative question is often born out of other questions, such as when an unexpected answer to a verifying question leads me to think of other potential issues or scenarios based on this new information.
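Continuing the same hypothetical discount example, an investigative question is asked out of curiosity rather than against a stated requirement:

```python
# Same illustrative function as before; no requirement says anything about
# negative discounts, so asking about them is pure investigation.

def apply_discount(price: float, discount_pct: float) -> float:
    return max(price - price * discount_pct / 100, 0.0)

# Investigative question: "What happens if I pass a negative discount?"
result = apply_discount(10.0, -20)
print(result)  # the price is inflated rather than rejected
```

There was no expected answer here; whatever comes back is new information, and a surprising answer (a negative discount quietly acting as a surcharge) is exactly the kind of result that breeds further questions.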

Finally, clarifying questions are where I want to challenge an answer the software has already given me, either because I am suspicious of it or, more likely, because I just want to make sure I fully understand it. Now I know I said above that asking the same question over and over again is a waste of a tester’s time, and I stand by that, but that relates to the tester not realising they are asking the same question. A clarifying question is where I knowingly ask the same question (maybe in a slightly different way) to make sure I understand the information I already have, which is very different.
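A clarifying question might deliberately re-ask the earlier investigative question through a different route, to confirm that a surprising answer really means what I think it means (again using the hypothetical `apply_discount` from above):

```python
def apply_discount(price: float, discount_pct: float) -> float:
    return max(price - price * discount_pct / 100, 0.0)

# Earlier answer: a -20% discount turned 10.0 into 12.0.
# Clarifying question: "Does a negative discount really behave as a
# surcharge, or was that one answer a coincidence?" Ask again, differently:
assert apply_discount(10.0, -20) == 10.0 * 1.2   # behaves like +20%
assert apply_discount(200.0, -50) == 200.0 * 1.5 # and scales consistently
```

I am knowingly asking the same underlying question, but phrased so that the answer confirms (or corrects) my mental model of the behaviour.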
