New Idea

I would like an option to exclude a particular page from the Top 10 > Slow Pages (Average) section.

 

Use Case:

I have a test with 10 virtual users, each signing into an app and doing different things. The problem I am facing is that the sign-in page (or the landing page after the user signs in) is one of the slow pages, so after a load test, 6-8 entries in the Slow Pages (Average) list are filled with sign-in page data from different users, and the Top 10 list ends up looking like this:

  1. Scenario1: Page 0003 (Action XXXX)
  2. Scenario3: Page 0002 (Sign in)
  3. Scenario2: Page 0002 (Sign in)
  4. Scenario5: Page 0002 (Sign in)
  5. Scenario4: Page 0002 (Sign in)
  6. Scenario9: Page 0002 (Sign in)
  7. Scenario6: Page 0002 (Sign in)
  8. Scenario8: Page 0008 (Action XXXX)
  9. Scenario10: Page 0002 (Sign in)
  10. Scenario7: Page 0002 (Sign in)

Because of this, I think I am missing information about other slow pages.

 

I would like to either:

  • Show the "Sign in" page only once in the list (the instance that took the longest; see the sketch after this list)

OR:

  • After seeing the result above, be able to exclude "Sign in" from the Top 10 list in subsequent test iterations.
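For illustration, a minimal sketch (in Python, with made-up page data) of the first option: keep only the slowest instance of each page before ranking the Top 10.

```python
# Made-up page records: (scenario, page, average load time in seconds)
results = [
    ("Scenario1", "Page 0003 (Action XXXX)", 4.2),
    ("Scenario3", "Page 0002 (Sign in)",     6.1),
    ("Scenario2", "Page 0002 (Sign in)",     5.9),
    ("Scenario8", "Page 0008 (Action XXXX)", 3.7),
]

# Keep only the slowest entry per page name, then rank
slowest_per_page = {}
for scenario, page, seconds in results:
    if page not in slowest_per_page or seconds > slowest_per_page[page][2]:
        slowest_per_page[page] = (scenario, page, seconds)

top10 = sorted(slowest_per_page.values(), key=lambda r: r[2], reverse=True)[:10]
for rank, (scenario, page, seconds) in enumerate(top10, start=1):
    print(f"{rank}. {scenario}: {page} - {seconds:.1f} s")
```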

 

Also from @AlexKaras:

"I think that it must be considered how to clearly indicate that the given report lists not Top 10 Slow Pages, but Top 10 Slow *Filtered* Pages, so that it is clear to report viewer that there are even more slow pages than those reported but for some reason these pages (provide names?) were filtered out."

I would like LoadComplete to support integration with version control systems (SVN, for example; other systems would be welcome as well), just like TestComplete supports SVN.

 

Please vote if you would love to see this implemented as well.

 

 

 

Link Offset values to Virtual User Groups

Status: New Idea

Currently, you can only set one Offset value for the complete test.

This is not sufficient, because I need different Offset values for different Virtual User Groups to simulate slow- and fast-working users: our application uses a timestamp in the users' response body to calculate how fast or slow a user is working with the application.

 

I need to be able to set different Offset values and link them to Virtual User Groups, so that I can simulate "fast" users and "slow working" users.
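To make the idea concrete, here is a rough sketch of the behaviour I have in mind; the group names, offset values, and the way the timestamp is produced are all made up for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical mapping: Virtual User Group -> its own offset (made-up values)
group_offsets = {
    "FastUsers": timedelta(seconds=5),   # simulates quickly working users
    "SlowUsers": timedelta(minutes=2),   # simulates slowly working users
}

def stamped_time(group: str) -> str:
    """Timestamp sent with the request, shifted by the group's offset."""
    return (datetime.now() + group_offsets[group]).isoformat()

print("FastUsers:", stamped_time("FastUsers"))
print("SlowUsers:", stamped_time("SlowUsers"))
```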

 

2017-12-14_13-58-27.png

Include used variables (and their values) in Load Test Report

Status: New Idea

Use case:

My load tests use lots of variables. These variables often influence the test circumstances, and I want the variables that were used to be mentioned in the Load Test Report. A good way, I think, would be to implement a new tab named Variables and list the used variables and their values there.

 

See below for example:

 

2017-12-14_15-40-07.png

 

Note: also make this info available in the Exports (PDF, HTML).
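For the HTML export, this could be as simple as rendering the variable list as a table. A sketch with made-up variables:

```python
# Made-up project variables for illustration
variables = {
    "VUSERS": "50",
    "RAMP_UP": "120 s",
    "TARGET_HOST": "https://test.example.com",  # hypothetical value
}

# Render a simple Variables table for the HTML export
rows = "\n".join(
    f"  <tr><td>{name}</td><td>{value}</td></tr>"
    for name, value in sorted(variables.items())
)
html = f"<table>\n  <tr><th>Variable</th><th>Value</th></tr>\n{rows}\n</table>"
print(html)
```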

Split up offset values for Time

Status: New Idea

Right now, you can only enter Offset values in hours (or fractions of an hour).

 

For example, to get a 1-minute offset you have to enter 1/60 (the value 1 stands for 1 hour).

For smaller units, you have to work out even smaller fractions.

Also, you cannot set offset values for years/months/days.

 

Please improve this by providing separate input fields for offset values for years, months, days, etc.
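To illustrate, a small sketch of what the current single hours-based field forces you to do, versus what separate fields could combine automatically (the field set and helper are hypothetical):

```python
from datetime import timedelta

# Today: everything has to be expressed as (fractions of) one hour.
one_minute = 1 / 60          # ~0.0167 - the current workaround for 1 minute
print(one_minute)

# With separate input fields, the tool could combine the parts itself:
def offset_from_fields(days=0, hours=0, minutes=0, seconds=0) -> timedelta:
    """Combine separate day/hour/minute/second fields into one offset.
    (Years and months are left out here: they need calendar-aware math.)"""
    return timedelta(days=days, hours=hours, minutes=minutes, seconds=seconds)

print(offset_from_fields(days=1, minutes=30))   # 1 day, 0:30:00
```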

 

2017-12-14_14-19-23.png

Include image ID in error message

Status: New Idea

On a regular basis, Amazon instances launched by LoadComplete crash.

After the test, this error message is displayed.

 

For better analysis, could the instance ID be displayed in the error message?
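For reference, the instance ID is already available from the EC2 API, so the error message could include it. A rough boto3 sketch (the region and the failure check are assumptions):

```python
import boto3  # assumes AWS credentials are configured

ec2 = boto3.client("ec2", region_name="eu-west-1")  # region is an assumption

# Report failing instances by ID, the way the error message could
statuses = ec2.describe_instance_status(IncludeAllInstances=True)
for status in statuses["InstanceStatuses"]:
    if status["InstanceStatus"]["Status"] != "ok":
        print(f"Load station failed: instance {status['InstanceId']} "
              f"(instance status: {status['InstanceStatus']['Status']})")
```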

 

2017-12-07_11-41-53.png 2017-12-07_11-23-56.png

Currently, in the Test Editor, you can only choose from a few pre-defined Connection Speed settings. Connection speeds could be simulated much better if this were changed so that you could specify exact upload/download bandwidth values (in Kb/s, MB/s, etc.); see the example and sketch below.

 

E.g.

Download speed: 40 MB/s

Upload speed: 10 MB/s
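As a back-of-the-envelope illustration of why exact values matter (the page size is made up), the simulated transfer time follows directly from the bandwidth setting:

```python
def transfer_time(size_bytes: int, bandwidth_mb_per_s: float) -> float:
    """Seconds needed to move `size_bytes` at the given bandwidth (MB/s)."""
    return size_bytes / (bandwidth_mb_per_s * 1_000_000)

page_size = 2_500_000  # a 2.5 MB page, made-up value
print(f"At 40 MB/s: {transfer_time(page_size, 40):.3f} s")  # 0.063 s
print(f"At 10 MB/s: {transfer_time(page_size, 10):.3f} s")  # 0.250 s
```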

 

2017-08-02_14-53-31.png

Use case:

We run our tests on custom-built Amazon Images.

 

However, to maintain an overview, Tags are used to identify images (these are also used for cost management).

 

Please improve LoadComplete so that Tags (keys and values) can be specified in LoadComplete and are set when the Amazon Image is created.
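For reference, EC2 already accepts tags at launch time, so LoadComplete would only have to pass the user's keys and values through. A boto3 sketch (the AMI ID, instance type, region, and tag values are made up):

```python
import boto3  # assumes AWS credentials are configured

ec2 = boto3.client("ec2", region_name="eu-west-1")  # region is an assumption

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # made-up custom image ID
    InstanceType="t3.medium",         # made-up type
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [
            {"Key": "Project",    "Value": "LoadTest"},  # made-up tags
            {"Key": "CostCenter", "Value": "QA"},
        ],
    }],
)
```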

 

2017-11-16_15-18-51.png 2017-11-16_15-22-58.png

Provide option to Copy (and Paste) Scenarios

Status: Community Feedback Requested

Provide option to Copy (and Paste) Scenarios.

 

Selecting Copy (option to be added in the context menu) copies the selected scenario to memory.

 

Pasting (option to be added in the context menu) inserts the copy as [samename]_02 (or something like that).

2017-11-16_13-49-03.png

 

Status: Community Feedback Requested

Hi,

 

There's a workaround in the latest version of LoadComplete (ver. 4.80). You can create a new scenario and then copy-paste all operations from the source scenario.

Provide option to Copy (and Paste) Tests.

 

Selecting Copy (option to be added in the context menu) copies the selected test to memory.

 

Pasting (option to be added in the context menu) inserts the copy as [samename]_02 (or something like that).

 

2017-11-16_13-49-46.png

Currently, you can only check for "equals", "contains", or "does not contain".

 

But I would like to be able to check whether a value lies between specific bounds (e.g. > 0 and < 1000).

 

Could this be implemented in the Operations section?
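The check itself is trivial. A sketch of the rule the Operations section would evaluate (the value and bounds are made up):

```python
def between(value: float, lower: float, upper: float) -> bool:
    """Pass when the checked value lies strictly between both bounds."""
    return lower < value < upper

extracted_value = 250  # made-up value taken from a response
assert between(extracted_value, 0, 1000)   # e.g. > 0, < 1000
```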

 

2017-11-10_10-20-22.png

In the generated Report, all sorts of values are calculated and displayed, for example Page Load Time, Scenario Completion Time, etc. These values are displayed at 5-second intervals.

But I miss overall averages. Of course, these can be calculated by copy-pasting the tables into Excel, but it would be better if they were included in the reports as well:

 

see attached example:

2017-11-08_16-10-44.png
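What I currently compute by hand in Excel is just this (a sketch over made-up 5-second samples):

```python
# Page Load Time sampled at 5-second intervals (made-up values, in seconds)
samples = [1.8, 2.1, 2.4, 3.0, 2.2, 1.9]

overall_average = sum(samples) / len(samples)
print(f"Overall average Page Load Time: {overall_average:.2f} s")  # 2.23 s
```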

We run our tests on Amazon Images, but the test report lacks any info about these images.

 

In the test report, we would like to see image specifics, such as the image type as defined in the image definition:

 

2017-11-07_16-24-59.png

 

Also IP addresses as they are generated when the test is initiated (see below):

2017-11-07_16-18-02.png

 

Please include all of the above info in the test log (for example on the Infrastructure tab, or on a new tab), so that a more informative Test Report can be generated.
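All of this is available from the EC2 API when the test starts, so the report could capture it automatically. A boto3 sketch (the region and the tag filter are assumptions):

```python
import boto3  # assumes AWS credentials are configured

ec2 = boto3.client("ec2", region_name="eu-west-1")  # region is an assumption

# Look up the load-station instances, here by a made-up tag filter
reservations = ec2.describe_instances(
    Filters=[{"Name": "tag:Project", "Values": ["LoadTest"]}]
)["Reservations"]

for reservation in reservations:
    for inst in reservation["Instances"]:
        print(inst["InstanceId"],
              inst["InstanceType"],
              inst.get("PublicIpAddress", "-"),
              inst.get("PrivateIpAddress", "-"))
```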

 

 

We run our load tests on Amazon Images. We would like to monitor server metrics (CPU, memory, network, etc.) on these images and include them in the logs.

 

Currently, this is (as far as I know) not supported by LoadComplete.

 

Please include this feature in an upcoming version of LoadComplete.
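As a possible data source: for EC2 instances, these metrics already exist in CloudWatch, so LoadComplete could pull them per instance. A boto3 sketch (the region, instance ID, and time window are made up):

```python
from datetime import datetime, timedelta, timezone
import boto3  # assumes AWS credentials are configured

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")  # assumption

end = datetime.now(timezone.utc)
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # made up
    StartTime=end - timedelta(minutes=30),  # made-up test duration
    EndTime=end,
    Period=60,
    Statistics=["Average"],
)
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], f"{point['Average']:.1f}% CPU")
```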

 

2017-11-07_16-10-12.png

Right now, it's hard to get an overview of which Requests within a specific scenario have Data Extractors and Data Validators in them.

 

Could LoadComplete be improved so that markers are placed on the requests that have Data Selectors and/or Validators in them?

 

See below screenshot:

2017-10-31_13-13-22.png 2017-10-31_13-07-19.png

Having created multiple test items, I noticed that the test conditions (load profile, continuous load settings, think time settings, QoS, etc.) are not set at the level of specific test items, but at the global level (for all test items).


Is this logical? I can easily think of cases where you want different load profiles for different test items.
See the attached screenshot.


I have created 2 test items, but I can only configure these settings at the general level.

 

2017-06-14_11-01-59.png

 

I want LC to be improved so that I can set different load scenarios for different test items. The same goes for Think Time, as illustrated in the screenshot.

 

I know it's possible to create different tests and give them different load scenarios, but I am specifically referring to tests that run in parallel (see the sketch below).
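A rough sketch of the structure I am asking for: settings attached to each test item instead of one global block (all names and values are made up):

```python
# Today: one global settings block that applies to every test item.
# Requested: settings stored per test item, e.g.:
test_items = {
    "TestItem1": {"load_profile": "steady",  "virtual_users": 50,
                  "think_time": "as recorded"},
    "TestItem2": {"load_profile": "ramp-up", "virtual_users": 200,
                  "think_time": "2 s fixed"},
}

# Both items run in parallel, each with its own profile:
for name, settings in test_items.items():
    print(f"{name}: {settings}")
```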

 

 

I run the test on 4 connected stations (using 4 instances of LC's Remote Agent).

In the report, I can't make any distinction in response speed, transfer speed, TTFB, or TTLB per machine. I am not referring to server metrics like CPU, but to connection info and graphs that distinguish between the machines involved (Remote Agents).

 

For example, I want to see the difference in average Page Load Time between machine 1 and machine 2.

Another example: I want to compare the average TTFB of machine 1 with the average TTFB of machine 3, and so on.
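A sketch of the per-agent grouping I would like the report to do (machine names and timings are made up):

```python
from collections import defaultdict

# (remote agent, TTFB in ms) per request - made-up measurements
measurements = [
    ("machine1", 120), ("machine1", 140),
    ("machine2", 310), ("machine2", 290),
    ("machine3", 180),
]

# Group by machine, then average - the per-agent view the report lacks
ttfb_by_machine = defaultdict(list)
for machine, ttfb in measurements:
    ttfb_by_machine[machine].append(ttfb)

for machine, values in sorted(ttfb_by_machine.items()):
    print(f"{machine}: average TTFB {sum(values) / len(values):.0f} ms")
```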

 

 

2017-06-13_17-21-12.png

Currently, when you disable the option to launch a specific browser, you cannot select the option "Record traffic from this browser only" (it is only enabled if you enable the launch option).

 

Consider this scenario: I don't want LC to start a browser, but I do want to record traffic from Chrome only.

 

With the current setup, this is not possible, so I want LC to be updated to support this scenario.

 

2017-06-13_12-01-12.png

Please improve LoadComplete so that the tester can add (personal) notes, such as test environment specifics, to the test log, because a lot of relevant test-specific data is not captured in the log.

 

See attached screenshot for illustration:

 

2017-10-27_13-37-13.png

Please, could LoadComplete be improved so that project variables can be defined for hosts?

 

That way you don't have to change the host on each request; you can change it once at the project level (see the sketch below).
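A sketch of the substitution I have in mind (the variable name, placeholder syntax, and hosts are all hypothetical):

```python
# Hypothetical project-level variable instead of a hard-coded host per request
variables = {"HOST": "https://staging.example.com"}  # change once, per project

# Requests reference the variable rather than a literal host
scenario = ["%HOST%/login", "%HOST%/cart/add", "%HOST%/checkout"]

for template in scenario:
    print(template.replace("%HOST%", variables["HOST"]))
```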