Forum Discussion

Darshana
Occasional Contributor
10 years ago

How to use TestComplete in Agile

Hi,

I am new to TestComplete. I have started using it for functional testing of a desktop application.

Now I want to use it in an Agile project.

I would also like to use it to write test scripts before development has started. I am not sure how I can access the objects before the project has been developed?

Your help will be appreciated.

 

Thanks

 

Darshana

  • vajindarladdad
    Frequent Contributor

    Hi Darshana,

    Copied from http://www.softwaretestinghelp.com/10-tips-you-should-read-before-automating-your-testing-work/ :

    Automate your testing work when your GUI is almost frozen but you have a lot of frequent functional changes.

     

    Writing automated tests when the UI is not stable is not a good idea. You will not know which control properties you can rely on in your automation scripts.
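    To make the property concern concrete, here is a minimal sketch of how a TestComplete script (Python) typically locates a control by its properties. The process name, property values, and function name are hypothetical placeholders; if the UI is still changing, lookups like this break every sprint:

        # Minimal TestComplete (Python) sketch; "MyApp" and the property
        # values are hypothetical placeholders for the application under test.
        def check_login_button():
            app = Sys.Process("MyApp")  # attach to the running desktop app
            # Locate the button by its control properties. These only become
            # dependable once the UI is (nearly) frozen, which is why
            # automating against an unstable UI is risky.
            button = app.FindChild(["WndClass", "WndCaption"], ["Button", "Log In"], 10)
            if button.Exists:
                button.Click()
                Log.Message("Log In button found and clicked.")
            else:
                Log.Error("Log In button not found - did its properties change?")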

     

    We have faced a similar situation. We keep a lag of one sprint to complete our automation work.

    Example:

    Sprint 1:
        Dev creates the new UI part and new functionality.
        Tester prepares the test scenarios to be automated.

    Sprint 2:
        Dev works on new tasks.
        Tester automates what was developed by the dev in the previous sprint.

     

    I hope this has been of some help.

    • Darshana
      Occasional Contributor

      Thanks, vajindarladdad,

       

      I would like to know how automation is actually applied on Agile projects.

      Are the same regression, load, and performance scripts used?

      Or are the scripts for unit tests (developed by the programmers) used to automate the functional testing?

      Who develops the automation scripts for Agile projects? The developer or the tester?

       

      Thanks for your guidance.

       

      Regards

      Darshana

      • vajindarladdad
        Frequent Contributor

        Hi Darshana,

        We were creating regression tests and adding them to our regression suite.

        In addition to that, we were creating API tests for the same functionality, which do not require any UI.

         

        Coming back to your questions:

        Q) How is automation actually applied on Agile projects?

        A) First, you need to understand what new functionality is being developed in the current sprint, then analyse which kind of test would best check that functionality (API, UI-based, etc.), and then develop the test.

         

        I think creating API tests would suit your requirement best (a minimal sketch follows at the end of this reply).

         

        Q) Are the same regression, load, and performance scripts used?

        A) Yes, you can use the same suite, but minimal modifications may be required.
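
        As a concrete illustration of the API-test suggestion above, here is a minimal sketch using TestComplete's aqHttp scripting object from a Python script unit (assuming a TestComplete version that provides aqHttp). The endpoint URL and the expected status are hypothetical placeholders:

            # Minimal TestComplete (Python) API-test sketch; the endpoint and
            # the expected status are placeholders for your application's API.
            def test_get_customer():
                # aqHttp is TestComplete's built-in object for HTTP requests
                request = aqHttp.CreateGetRequest("http://localhost:8080/api/customers/1")
                response = request.Send()

                if response.StatusCode == 200:
                    Log.Message("GET /api/customers/1 returned 200 OK")
                    Log.Message("Response body: " + response.Text)
                else:
                    Log.Error("Unexpected status code: " + str(response.StatusCode))

        Because a test like this talks to the application below the UI, it can often be written earlier in the sprint than a UI script, which fits the one-sprint-lag approach described above.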

  • It's been a while since I was on an Agile project, but when I was, we would work with the developers (and the user community) to determine which tasks (defined by our task cards) would be in the next iteration/sprint. From there we would start creating our automated test scripts, making changes as needed as we went. Once code was ready to test, we would execute the test scripts. We would continue this process for each iteration/sprint, while at the same time building a regression testing suite that we would execute after the last build for that particular user story. Hope this helps.

  • Ryan_Moran
    Valued Contributor

    I am facing a similar issue in my current position: how to work test automation into an agile environment.

    The primary concerns I have are reusability and where to focus our efforts.

     

    I have for some time been automating our test cases just for the sake of getting them done and appeasing my coworkers, who are desperate to justify their necessity - but where does the meat go? I have yet to see any value in this type of automation. In fact, I find writing test cases to be a complete waste of time and added fluff in the QA process. Writing out scenarios and breaking down requirements into meaningful action items to test I have found to be pretty useful, but writing out all my steps for everything I test... a waste of time.

     

    Just as I drive to work every day, I do not need to write down the steps and document every turn I take to arrive at my destination. I do it quite well on my own - naturally - because I have done it for some time now and I am familiar with the steps involved in starting my vehicle and arriving at work. Sure, I occasionally look for alternate routes - or maybe even drive somewhere besides work - but the maps do not apply to subsequent trips, and therefore I do not see a purpose in keeping them around.

    I know how to drive my car - I know how to use google maps - I'm prepared for whatever lies ahead.

     

    So automation can jump through a bunch of steps with different data sets, but so can I - and I learn more - catch more - do it faster without having it automated or wasting time on the fluffy redundant documentation of that process.

    So I have typically been leaning towards automation being valuable only in regression testing.

     

    What do you all think about test cases and can you elaborate on the greatest value that automation has in the agile environment?

    • tristaanogre
      Esteemed Contributor

      One thing of note on the "waste of time" part of documenting test cases.  Let's use your example of driving to work.

       

      For you, it's natural.  Heck, when I drive to work, sometimes I kind of go, mentally, on auto-pilot.  I know where all the turns are, where all the exits are, etc.  I know EXACTLY what to do every time.

      HOWEVER... if I have a friend that I want to visit me at the office and they've NEVER been here before - even worse, they have never been in this part of the country EVER - and they want to get to my place of employment from my home town, I need to communicate how to get here. And, for that matter, I'd better be pretty clear on the directions, because in the 90 minute drive I have there are a LOT of turns, exits, roads, etc. Not to mention contingencies of "if it's snowing, forget that route and take this alternate route". Somehow, that communication needs to happen. Even if we go techno-modern and tell them "just use the GPS, here's the address"... well, SOMEONE needed to input all that data into the GPS system in the first place, right?

       

      Consider test plans, analysis, and so on from this perspective.  They are the means of communicating to someone else how to do what you just did.  The possibility exists for you to get hit by a bus tonight and SOMEONE needs to know a) what you've already tested and b) what still needs to be tested and c) how to go about doing so because that SOMEONE has to ramp up pretty quickly.  Even exploratory testing requires some sort of documentation to this effect.  If I've spent 5 hours testing an application and someone comes up and says, "How far are you from being done?" what do you respond with?  If you don't have a measurement point to say, "Well, out of 40 requirements, I've fully tested 29, partially tested 10, and have one yet to address", you're basically going to answer with "Um... I dunno".

      In Agile, also, there's this idea of "the definition of done".  You need some sort of metric to decide when done means done.  If there's no analysis or measurement, then you can't define done accurately.  Again, the level of documentation may be different from project to project or environment to environment, but somehow the definition  needs to be made.

      Let's go one step further... back to driving to work. I get a call after half an hour: "Dude, I'm like TOTALLY lost. There was an accident on the route and I got detoured and have NO idea where to go from here." My answer is "What is the last road that you were on in the directions and where did you go from there?" If my friend doesn't have that information, I'm just as lost as they are and have no real way of getting them to my office.

       

      Test plans, test cases, test scenarios, test scripts provide that necessary documentation to evaluate "OK, something unexpected happened.  How did we get here so we can figure out what went wrong?"  Again, exploratory testing SHOULD include documenting what you did.  If there's no documentation, how do you report the reproduction steps of the bug?  

       

      Too often, Agile testing becomes "ad-hoc"... just do it, do it fast, and get it done.  I find that to be very dangerous.  I do think that sometimes the amount of documentation that is done as part of the testing process is excessive.  But I think throwing out the documentation part of the process entirely is just as dangerous. 

      One more thing: with software increasingly becoming integrated into all levels of society, there are legal implications of what happens when software goes wrong. In those legal implications, there is regulation... PCI, Sarbanes-Oxley, HIPAA, etc. are just some examples. With those comes the necessity of being able to answer an audit. Some regulator comes by and says, "Show me how you are HIPAA compliant in your storage and access of patient records." If you have no documentation of how your tests verified that compliance... well...

      • Marsha_R
        Moderator

        We don't document our exploratory testing much.  We have a high level test plan that's developed by the group and attached to the JIRA ticket that has the story/task.  The tests in TestComplete have enough comments and log entries in them that a user familiar with the software would understand them.  The ticket also contains links to a log of a successful test run and a link to the test in svn.  
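
        As an illustration of what "enough comments and log entries" can look like in practice, here is a minimal sketch (Python, with a hypothetical test and hypothetical messages) using TestComplete's Log object, so the project log itself documents what the test did:

            # Hypothetical example of a self-documenting TestComplete test:
            # descriptive log entries let a reader follow the run from the log.
            def verify_order_total():
                Log.Message("Starting test: verify order total for a 2-item cart")
                # ... steps that drive the application would go here ...
                Log.Checkpoint("Order total matched the expected value of 59.90")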

         

        We don't expect that a brand new person would be able to pick up where an experienced person left off.  I started here when everything was documented step by step in Excel sheets and it really wasn't that much help until I had actual user training on the software.