Forum Discussion

mtan2014
Occasional Contributor
12 years ago

Load UI summary

Hi,


I ran a distributed load test and the summary wasn't showing correct numbers. The test contained 3 scenarios. Each scenario contains a generator (30/s), a SoapUI Runner, and 3 Table Logs connected to the runner.

The summary report said the total number of requests was only 164 with 159 assertion failures, but the designer shows a total in the 60k range with 978 failures. When opening the scenarios, only one showed numbers on the SoapUI Runner for each test step; the rest didn't. Also, the total count on the runner does not match the total count for the scenarios. All the log tables were empty. The graph in the Statistics tab also shows numbers in the 60k range, but the Summary report does not match. Why is this?

6 Replies

  • mtan2014
    Occasional Contributor
    Also, in the Summary report's Statistics tab, what is the meaning of assertion failures and the status "pass"? The assertions were put in the SoapUI test cases for each step; is that what it looks at? Also, the total number of assertions and the number of assertion failures are the same. Does this mean that all requests failed? One of the tests we ran had a total of 211,080 requests with 2,090 assertion failures. What is the status of the difference? Are they passes? If so, how does it know they passed / what does it look at?

    With the generator, we set 240/s but only got around 120/s for the full tests. Why doesn't it reach the full rate?
  • mtan2014
    Occasional Contributor
    With the Table Logs, why are they not showing any data when running distributed tests using agents?
  • Hello,
    I'll try to address your problems one at a time below.

      1. Inconsistent numbers in summary report
      The inconsistent numbers you are seeing might be due to requests being discarded. If they are, they should show up in the Discarded counter in the Runner. Could you send me screenshots of the numbers you are talking about, so I can fully understand the problems you are having?

      2. Table Log components being empty
      The logs being empty might indicate that the requests failed. Have you tried connecting a log to the fail output (the leftmost terminal)?

      3. Setting the generator to 240/s but only getting 120/s in the test.
      This is a strong indicator that the requests are being discarded.

      This could mean one of three things

        1 - Your machine is not able to generate all the requests.
        Symptom: high LoadUI CPU usage.
        Solution: distribute the test to agents so you can generate the load you need.

        2 - The tested server can't handle an arrival rate higher than 120/s.
        Symptom: the server takes longer and longer to respond (and eventually stops responding altogether).
        Solution: none needed here; you have found the limit of what the server can handle!

        3 - You are reaching more than 100 concurrent users.
        Symptom: the Running counter in the Runner reaches 100 without the server's response time going up.
        Solution: raise the "Max concurrent requests" value in the Advanced tab of the SoapUI settings.
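
      The third case follows from Little's law: with a cap of C concurrent requests and an average response time of T seconds, no more than C/T requests per second can be in flight, and anything generated above that gets discarded. A minimal back-of-the-envelope sketch (the numbers below are illustrative assumptions, not measurements from this thread):

```python
# Illustrative sketch: why a concurrency cap limits the effective request rate.
# By Little's law, in_flight = arrival_rate * response_time, so a cap of
# max_concurrent requests allows at most max_concurrent / response_time req/s.

def effective_rate(requested_rate, max_concurrent, response_time):
    """Return (achieved_rate, discarded_rate) for a capped generator."""
    sustainable = max_concurrent / response_time  # req/s the cap allows
    achieved = min(requested_rate, sustainable)
    return achieved, requested_rate - achieved

# Hypothetical numbers: 240 req/s requested, 100 concurrent max, ~0.83 s responses
achieved, discarded = effective_rate(240, 100, 0.83)
print(f"achieved ~{achieved:.0f}/s, discarded ~{discarded:.0f}/s")
```

      With those assumed numbers the cap allows only about 120 req/s, so roughly half the generated requests would be discarded, which matches the symptom of setting 240/s and seeing ~120/s.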


      4. The Table log is not logging anything when running a distributed test.
      To enable this, just check the "enabled in distributed mode" check box in the Table Log. NOTE: this causes each entry to be sent individually over your network; in large tests with high rates, this can put a big strain on your computer and network. We advise using this feature only for small tests or debugging.

      5. Assertion failures.
      Yes, from your description it sounds like all your requests failed. If they failed with assertion errors, it means the assertions in SoapUI failed.


    Did I cover everything? Any questions?

    Regards,
    Max
    LoadUI Developer
  • mtan2014
    Occasional Contributor
    Hi Max,


    1. Please see the attached screenshot where the numbers aren't matching. Scenario 2 did not have any counts for each step.
    2. All runners had Table Logs attached, but I missed the "enabled in distributed mode" option, so that was probably why they didn't show anything.
    3. How do we know if it is case 2? The tested server was being monitored for processor time, and it was mostly around 50%, though it did reach 100% a few times. The test was run with 10 agents, and the max concurrent users and queues were increased prior to testing.
    4. Thanks. I had missed this part.
    5. Do you mean all 60k requests failed (apart from the assertion failures from SoapUI)? Why would that be?


    Thanks.
  • mtan2014
    Occasional Contributor
    Is there a way to find out the requests sent by each agent and the ones actually received? In the Statistics tab under Agents, there are graphs for each agent, but they are empty.
  • Hi mtan2014.

    We have taken this issue into our internal support system and will answer your queries there.

    Best regards,

    Renato
    SmartBear Software