Forum Discussion

SuperTester
Contributor
4 years ago

Regression Testing - How to Approach

Hey guys,

 

I've got a somewhat open-ended question, and I wanted to hear some opinions on it.

 

How do you determine what should be tested in regression test scripts?

 

Should regression tests validate bug fixes? Should they test extreme cases or hypothetical scenarios?

 

Thanks!

3 Replies

  • tristaanogre
    Esteemed Contributor

    Regression tests, by definition, make sure that what worked in a previous version is not "broken" in the next release candidate.  To me, this means that you're doing functional testing based upon the feature set and requirements.  Technically speaking, this should also cover any previous bugs that were found and fixed because, if they were fixed properly, the test cases that originally found them would pass.

     

    When I did automated regression testing at a prior company, we did not create specific test cases to re-test the bugs.  We adjusted existing test cases to make sure that they covered the features that needed to be tested.  The whole reason a bug happened in the first place was that the previous test cases were insufficient in their code coverage.  So, we simply adjusted those test cases to cover the necessary code.  This way we kept the number of test cases to a manageable amount but increased our regression code coverage.

  • AlexKaras
    Champion Level 3

    Hi,

     

    Initial assumption: we are talking about automated functional (end-to-end) regression tests.

     

    By definition, a regression test verifies that a given version of the tested application behaves exactly like the previous one. And nothing more. Period.

     

    From the above definition it follows that:

    a) A test used for regression must pass if the examined version of the tested application behaves exactly like the previous one, and fail otherwise. Note that the criterion is 'the same behavior', not 'correct behavior' (or 'expected behavior', or 'behaves as required');

    b) At the time it is created, a regression test must match the current behavior of the tested application. Again, the current behavior may not be absolutely correct: it may contain problems and/or may not correspond to the requirements, but the test must pass for this version of the tested application. Obviously, if the tested application contains critical blockers, or functionality that works incorrectly and has no workaround, those parts should not be included in the regression set (simply because that functionality should not be delivered to production), and the problems found must be reported during test creation. (Or earlier, during manual testing.)

     

    When executed, a regression test must not fail and stop because of known problems. It is even better if those known problems are not reported as errors at all, so as not to congest the log with what are, in effect, false negatives, and to make it easier not to miss an actual change in behavior. (Which is the goal of regression testing.)
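    To illustrate the point about keeping known problems out of the error log, here is a minimal sketch in plain Python (no particular test framework; the issue ID `APP-101` and the log format are purely illustrative): a small registry of tracked issues downgrades their failures to "known" entries, so only unexpected failures surface as errors.

    ```python
    # Hypothetical registry of bugs already filed in the issue tracker.
    KNOWN_ISSUES = {"APP-101"}

    def report(check_passed, issue_id, log):
        """Log a failure as an error only if it is not a known, tracked problem."""
        if check_passed:
            log.append("PASS")
        elif issue_id in KNOWN_ISSUES:
            log.append("KNOWN (" + issue_id + ")")  # noted, but not an error
        else:
            log.append("ERROR")                     # genuine regression candidate

    log = []
    report(True, None, log)          # passing step
    report(False, "APP-101", log)    # failure matching a tracked issue
    report(False, None, log)         # unexpected failure -> real error
    # log is now ["PASS", "KNOWN (APP-101)", "ERROR"]
    ```

    The "ERROR" entries are then the only ones worth investigating as possible regressions.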

    It is up to you how to implement the above requirement. One of the most intelligent approaches that I have seen was designed like this:

    -- When a test failed for the first time because of an error in the tested application (or when the problem was already known at the moment of test creation), a reference to the corresponding case in the issue-tracking system was inserted into the test code;

    -- If a workaround existed, it was programmed into the test code so that the test could proceed;

    -- During subsequent test runs, the test code queried the issue-tracking system for the state of the problem. If the problem was still unresolved, the test code followed the workaround path; if it had been resolved, the test code followed the primary expected path. If a problem occurred on either of these paths, it was reported as an error to the test log and could indicate either a regression or a new issue.

     

     

    Update: Funny, we wrote practically the same with Robert 🙂

     

    • sonya_m
      SmartBear Alumni (Retired)

      SuperTester what a great conversation idea, thank you! 

      Thanks Alex, Robert!