Doubts about "think time" interpretations
Hi everybody,
we have a question about the correct interpretation of the think time in the reports we receive after a LoadNinja test session ends.
We ran a test to stress a single page of our application; the LoadNinja recorded script only loads the page, plus a few validations to make sure the page is loaded properly.
The script recording shows a think time of around 2 seconds, and we selected the "recorded think time" option for our test.
In the reports attached here as an example, we noticed that the average think time is around 8 seconds.
What explains this increase?
If we load the same page in our browser while the LoadNinja test is running, we experience a load time that is fully in line with the "Total nav timings (s)" reported in the final charts.
For the purposes of our performance test (a single page load), should we consider only the "Total nav timings (s)" and ignore the think time, or should we treat the "think time" as a relevant performance indicator?
Hi @fdelmoro
Hope you are well.
Are you sure the think time totals 2 seconds per step?
There can be multiple instances of think time within a single step, which could explain the difference between the recorded value and what you see in the reports (for example, four recorded pauses of roughly 2 seconds each in one step would add up to about 8 seconds of think time for that step).
That said, you can also set the think time for the playback to the minimum; wherever think time needs to be applied, LoadNinja will then use the minimum amount.
For your last question, using a minimum think time makes sense, especially if you need to benchmark your SPA under test and then run different load tests against it.
Here is some useful info on think time:
https://loadninja.com/resources/articles/performance-testing/3-common-load-testing-mistakes/
https://support.smartbear.com/loadninja/docs/scripts/settings.html
KR
Vinnie
Hi, thanks for the kind answer.
Your answer seems reasonable, but could you please confirm that the "think time" duration is not influenced by the page load time?
In other words, the time a virtual user waits for the page to load is not included in the think time reported in the final statistics summary?
Thanks in advance
Hi @fdelmoro
From a rendering point of view, it wouldn't be.
It does, however, affect how long a UI test takes to execute.
To avoid confusion, we provide the think time values in the reports:
https://support.smartbear.com/loadninja/docs/results/navigation-timings.html
Kind Regards
Vinnie
Hello, we got this feedback from LoadNinja customer care:
"This corresponds to the definition of the “Think time” metric - the average time virtual users spent simulating pauses between user actions during the playback. Anyway, I think that using the “Think time” metric is not the right way to measure if page load times are within the expected boundaries. My recommendation would be to use SLA validations instead – they will be throwing errors if a step takes longer than expected and you’ll have an opportunity to measure the number of such errors".
This answer seems to agree with us that the "Think time" metric is not the right way to measure page load performance, but it suggests using SLA threshold validations instead.
From our direct experience, the SLA threshold validation takes the think time into account; can you please confirm this hypothesis?
If this is true, do you still confirm the answer given above? (They suggested not using think time as a relevant metric, yet recommend a threshold validation that takes think time into account.)
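To make our doubt concrete, here is a small illustrative sketch based on our own assumption (not LoadNinja's actual implementation): it assumes a step's measured duration is roughly the navigation timing plus the think time replayed within that step. Under that assumption, an SLA threshold applied to the whole step can fail even when the page itself loads quickly. All numbers, names, and the threshold below are hypothetical.

```python
# Illustrative sketch only; assumes step duration = navigation timing + replayed think time.

def step_duration(nav_timing_s: float, think_times_s: list[float]) -> float:
    """Total step duration under the assumption stated above."""
    return nav_timing_s + sum(think_times_s)

# Hypothetical numbers matching the thread: ~2 s page load, ~8 s of think time per step.
nav_timing = 2.0
think_times = [2.0, 2.0, 2.0, 2.0]   # several recorded pauses within one step

duration = step_duration(nav_timing, think_times)
sla_threshold_s = 5.0                # hypothetical SLA limit for the step

print(f"Navigation timing: {nav_timing:.1f} s")           # the page load itself is fast
print(f"Step duration:     {duration:.1f} s")             # page load plus think time
print(f"SLA violated?      {duration > sla_threshold_s}")  # True, despite the fast page load
```

If this sketch reflects how the threshold is actually evaluated, the SLA validation would be flagging think time rather than page load performance, which is exactly the point we would like you to confirm.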
