LoadComplete


include image ID in error message

Status: New Idea
by on ‎12-07-2017 02:47 AM

On a regular basis, Amazon instances instantiated by LoadComplete crash.

After the test, this error message is displayed.

 

For better analysis, could the instance ID be displayed in the error message?

 

2017-12-07_11-41-53.png, 2017-12-07_11-23-56.png
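For reference, the instance ID is available from inside the instance via the standard EC2 instance metadata service, so it could be attached to the error message. A minimal Python sketch (assuming IMDSv1 is enabled; IMDSv2 additionally requires requesting a session token first):

```python
from urllib.request import urlopen

def get_instance_id(timeout: float = 2.0) -> str:
    # 169.254.169.254 is the link-local address of the EC2 instance metadata service.
    url = "http://169.254.169.254/latest/meta-data/instance-id"
    with urlopen(url, timeout=timeout) as resp:
        return resp.read().decode("utf-8")

# The returned value (e.g. "i-0abc12345def67890") could then be appended
# to the error message shown after the test.
```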

Use case:

We run our tests on custom-built Amazon Images.

 

However, to maintain an overview, Tags are used to identify images (these are also used for cost tracking and management).

 

Please improve LoadComplete so that Tags (keys and values) can be defined in LoadComplete, and these values are set when the Amazon Image is created.

 

2017-11-16_15-18-51.png, 2017-11-16_15-22-58.png
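For reference, applying such tags boils down to a single API call right after the instances are launched. A minimal boto3 sketch (AWS credentials assumed to be configured; the tag keys/values below are examples, not LoadComplete settings):

```python
import boto3

def tag_instances(instance_ids, tags):
    # Apply the given key/value tags to the launched EC2 instances.
    ec2 = boto3.client("ec2")
    ec2.create_tags(
        Resources=instance_ids,
        Tags=[{"Key": k, "Value": v} for k, v in tags.items()],
    )

# Example (hypothetical instance ID and tag values):
# tag_instances(["i-0123456789abcdef0"], {"Project": "LoadTest", "Env": "perf"})
```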

Provide option to Copy (and Paste) Tests.

 

Selecting Copy (an option to be added to the context menu) copies the selected test to memory.

 

Selecting Paste (an option to be added to the context menu) inserts the copy as [samename]_02 (or something like that).

 

2017-11-16_13-49-46.png

Provide option to Copy (and Paste) Scenarios

Status: Accepted for Discussion
by ‎11-16-2017 04:54 AM - edited ‎11-16-2017 04:58 AM

Provide option to Copy (and Paste) Scenarios.

 

Selecting Copy (an option to be added to the context menu) copies the selected scenario to memory.

 

Selecting Paste (an option to be added to the context menu) inserts the copy as [samename]_02 (or something like that).

 2017-11-16_13-49-03.png

 

Currently, you can only check for equality, contains, or does not contain.

 

But I would like to be able to check whether a value lies between specific bounds (e.g. > 0 and < 1000).

 

Could this be implemented in the Operations section?

 

2017-11-10_10-20-22.png
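For illustration, a minimal sketch of the check such a "between" operation would perform; how the value is extracted from the response is outside the scope of the sketch, and the function name is hypothetical:

```python
def value_in_range(raw_value: str, lower: float, upper: float) -> bool:
    # Treat non-numeric values as a failed validation.
    try:
        value = float(raw_value)
    except ValueError:
        return False
    return lower < value < upper

# e.g. value_in_range(extracted_value, 0, 1000) -> True only for 0 < value < 1000
```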

In the generated report, all sorts of values are calculated and displayed, for example Page Load Time, Scenario Completion Time, etc., and these values are shown at 5-second intervals.

But I miss overall averages. Of course these can be calculated by copy/pasting the tables into Excel, but it would be better if they were included in the reports as well:

 

see attached example:

2017-11-08_16-10-44.png
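As a stop-gap, the overall figure can be computed from an exported table. A minimal sketch assuming the interval table has been saved as a CSV file (the file and column names are examples), taking a simple unweighted average of the per-interval values:

```python
import csv

def overall_average(csv_path: str, column: str) -> float:
    # Average the per-interval values of one report column.
    with open(csv_path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f) if row[column]]
    return sum(values) / len(values) if values else 0.0

# e.g. overall_average("report_intervals.csv", "Page Load Time")
```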

We run our tests on Amazon Images, but any information about these images is missing from the test report.

 

In the test report, we would like to see image specifics like image type as defined in the image definition:

 

2017-11-07_16-24-59.png

 

Also, the IP addresses that are assigned when the test is initiated (see below):

2017-11-07_16-18-02.png

 

Please include all of the above info in the test log, for example on the Infrastructure tab (or a new tab).

 

That way, a more informative Test Report can be generated.

 

 

We run our load tests on Amazon Images. We would like to monitor server metrics (CPU, memory, network, etc.) on these images and include them in the logs.

 

Currently this is (as far as I know) not supported by LoadComplete.

 

Please include this feature in an upcoming version of LoadComplete.

 

2017-11-07_16-10-12.png
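As a workaround today, basic server metrics for the instances can be pulled from Amazon CloudWatch outside of LoadComplete. A minimal boto3 sketch (credentials assumed to be configured, the instance ID is a placeholder; memory metrics additionally require the CloudWatch agent on the instance):

```python
from datetime import datetime, timedelta
import boto3

def cpu_utilization(instance_id: str, minutes: int = 60):
    # Fetch average CPU utilization for the instance over the last `minutes`.
    cw = boto3.client("cloudwatch")
    end = datetime.utcnow()
    stats = cw.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=end - timedelta(minutes=minutes),
        EndTime=end,
        Period=300,              # 5-minute data points
        Statistics=["Average"],
    )
    return sorted(stats["Datapoints"], key=lambda d: d["Timestamp"])

# e.g. cpu_utilization("i-0123456789abcdef0")
```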

Right now, it's hard to get an overview of which requests within a specific scenario have Data Extractors and Data Validators in them.

 

Could LoadComplete be improved so that markers are placed on the requests that have Data Extractors and/or Validators in them?

 

See below screenshot:

2017-10-31_13-13-22.png, 2017-10-31_13-07-19.png

Please improve LoadComplete so that the tester can add (personal) notes to the test log, such as test environment specifics, because a lot of relevant test-specific data is not currently captured in the log.

 

See attached screenshot for illustration:

 

2017-10-27_13-37-13.png

Could LoadComplete please be improved so that project variables can be defined for hosts?

 

That way you don't have to change the host on each request, but only at the project level.

 

 

Use case:

 

I run a 1-hour load test with 75 users on 10 Amazon Images. After the test finishes, it takes 20 minutes to receive the log; in the meantime I watch this progress bar advance very slowly. Could the performance of this process (uploading the log file) be improved? Having to wait 20 minutes for the log of a 1-hour load test seems excessive.

 

Note: the bandwidth of our office network/Amazon connection is not the issue, so this can be excluded.

 

2017-10-26_14-49-48.png

Similar to this request, I would like LoadComplete to be able to use project variables as values for Page Think Times.

 

Use case: with this implementation, you can easily run the test at different speeds (for specific pages!).

 

See below screenshot:

 

2017-10-26_10-51-10.png 

I would like an option to exclude a particular page from being included in the Top 10 > Slow Pages (Average) section.

 

Use Case:

I have a test with 10 virtual users, each signing into an app and doing different things. The problem I am facing is that the sign-in page (or the landing page after the user signs in) is one of the slow pages, and after a load test, 6-8 entries in the Slow Pages (Average) list are filled with sign-in page data from different users, so the Top 10 list looks like this:

  1. Scenario1: Page 0003 (Action XXXX)
  2. Scenario3: Page 0002 (Sign in)
  3. Scenario2: Page 0002 (Sign in)
  4. Scenario5: Page 0002 (Sign in)
  5. Scenario4: Page 0002 (Sign in)
  6. Scenario9: Page 0002 (Sign in)
  7. Scenario6: Page 0002 (Sign in)
  8. Scenario8: Page 0008 (Action XXXX)
  9. Scenario10: Page 0002 (Sign in)
  10. Scenario7: Page 0002 (Sign in)

Because of this, I think I am missing information related to other slow pages.

 

I would like to either:

  • Show "Sign in" page only once (the one that took the longest) in the list

OR:

  • After I see the above scenario, be able to exclude "Sign in" from the Top 10 list in subsequent test iterations.

 

Also from @AlexKaras:

"I think that it must be considered how to clearly indicate that the given report lists not Top 10 Slow Pages, but Top 10 Slow *Filtered* Pages, so that it is clear to report viewer that there are even more slow pages than those reported but for some reason these pages (provide names?) were filtered out."

I would like LoadComplete to support integration with version control systems (like SVN; other systems would be welcome as well), just like TestComplete supports SVN.

 

Please vote if you would love to see this implemented as well.

 

 

 

I would like LoadComplete to be able to save/export request headers/bodies and response headers/bodies.

 

Why? 

Because we need to analyse specific requests and responses, and we want to use Excel for the analysis.

 

2017-09-22_12-27-12.png
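For illustration, the kind of export being asked for, sketched under the assumption that the traffic is available as a HAR file (for example captured with browser developer tools); it flattens requests/responses into a CSV that Excel can open:

```python
import csv
import json

def har_to_csv(har_path: str, csv_path: str) -> None:
    # Flatten the HAR entries into one CSV row per request/response pair.
    with open(har_path, encoding="utf-8") as f:
        entries = json.load(f)["log"]["entries"]
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["method", "url", "status", "request headers", "response headers"])
        for e in entries:
            req, resp = e["request"], e["response"]
            writer.writerow([
                req["method"],
                req["url"],
                resp["status"],
                "; ".join(f"{h['name']}: {h['value']}" for h in req["headers"]),
                "; ".join(f"{h['name']}: {h['value']}" for h in resp["headers"]),
            ])

# e.g. har_to_csv("recorded_traffic.har", "requests.csv")
```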

Currently, in the Test Editor, you can only select from a few pre-defined Connection Speed settings. Better simulation of connection speeds would be possible if this were updated to allow specifying exact Upload/Download bandwidth settings (in Kb/s, MB/s, etc.).

 

E.g.

Download speed: 40 MB/s

Upload speed: 10 MB/s

 

2017-08-02_14-53-31.png

search on variable names

Status: Community Feedback Requested
by on ‎09-14-2017 06:41 AM

I want LoadComplete to be improved so that the search function can also be used on parameter names that are in use.

Right now, when searching for a used parameter name, no results are returned.

 

2017-09-14_15-38-44.png

Use case:

We have multiple test environments that run identical software but have different environment settings (hardware, setup, etc.). When I record my tests on env1, I want to easily run them against env2.

 

You can see the hosts used in the scenario itself, but it's not easy to switch (currently you have to update all pages).

 

I want LoadComplete to make it easy to run the same test on different environments. How? By adding a Host column to the Test Editor window. In this field, the user can enter the environment URL (or a variable name) to easily switch between test environments, and then press Play.

 

For example:

env1 : www.testhost1.com

env2: www.testhost2.com

 

The script is recorded on testhost1.com; then, to run the test on testhost2.com, only the Host field in the Test Editor has to be changed from testhost1.com to testhost2.com (or the variable name).

 

2017-09-08_16-12-54.png
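For illustration, the substitution such a Host field would effectively perform on every recorded request URL (the host names are the ones from the example above):

```python
from urllib.parse import urlsplit, urlunsplit

def switch_host(url: str, new_host: str) -> str:
    # Replace only the host part of a recorded URL; path and query stay intact.
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, new_host, parts.path, parts.query, parts.fragment))

# switch_host("http://www.testhost1.com/login?x=1", "www.testhost2.com")
# -> "http://www.testhost2.com/login?x=1"
```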

To add custom headers to several / to all requests of the given scenario

Status: Accepted for Discussion
by Community Hero baxatob ‎09-04-2017 03:59 AM - edited ‎09-04-2017 06:40 AM

Use case:

 

The scenario contains x pages. Each page contains y requests.
E.g. I need to add a custom header to the first request on each page.

 

The idea is to add an option to edit a selected set of requests in one operation.

Something like this:

 

ideaLC.png

In the image above, I want to select Requests 189, 191 and 192 and then update them, adding a specific header: Parameter / Value.
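For illustration only, a sketch of the kind of bulk edit this idea describes; the Request class below is a hypothetical stand-in for LoadComplete's request objects, not a real API, and the header name/value in the usage comment are made up:

```python
from dataclasses import dataclass, field

@dataclass
class Request:
    name: str                                   # e.g. "Request 189"
    headers: dict = field(default_factory=dict)

def add_header_to_selected(requests, selected_names, header, value):
    # Add one custom header (Parameter / Value) to every selected request.
    for req in requests:
        if req.name in selected_names:
            req.headers[header] = value

# e.g. add_header_to_selected(scenario_requests,
#                             {"Request 189", "Request 191", "Request 192"},
#                             "X-Custom-Header", "some-value")
```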