Forum Discussion
It's been a LOOOONG time since I've had a chance to play around with LoadComplete... it was version 2 last time I touched it.
As for the two approaches that AlexKaras mentions, I've used both and, honestly, it depends a bit on what you're trying to do, the application under test, etc. As with functional testing in TestComplete, the needs of the project often dictate the approach to building the automation. As mentioned, when the traffic is broken up into multiple pieces, there is coordination that needs to happen in the scenario to make sure that you are doing things the same way every time.
As for the feature to turn a functional test into a load test, I have not used it. I've done all my work in the past directly in LoadComplete (or, when it was still part of the tool, TestComplete). The thing to remember is that LoadComplete is all about the HTTP requests themselves (GETs, POSTs, and so on). The actual UI, clicking on buttons, links, etc., really doesn't come into play. So, the feature mentioned is simply a way of taking a functional test and feeding the traffic it generates through LoadComplete's recorder. From what I understand, it's more of a convenience thing. You have a scenario created in TestComplete that mirrors a scenario you want to use for load testing. In the past, what is described in that article was a more manual process: you would start the recording in LoadComplete, then load up TestComplete and run the test. The integration that is present now just removes some of that manual work. But at the core, what you are doing is having your workstation send web traffic to your web server and having LoadComplete record that traffic for playback. Whether you generate that traffic by running a TestComplete functional test or by hand is immaterial to the end result.
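To make that concrete, here is a rough, purely illustrative sketch of what a recorded scenario boils down to once the UI is out of the picture: an ordered list of HTTP requests replayed against the server. The URLs, payloads, and use of Python's `requests` library are my own inventions for the example, not anything LoadComplete actually produces.

```python
# Illustrative sketch only -- not LoadComplete's internal format or API.
# Conceptually, a recorded load scenario is an ordered list of HTTP requests
# that each simulated user replays against the server; the UI never runs.
import requests

RECORDED_SCENARIO = [  # hypothetical recording of one user flow
    ("GET",  "https://shop.example.com/login", None),
    ("POST", "https://shop.example.com/login", {"user": "demo", "pwd": "demo"}),
    ("GET",  "https://shop.example.com/catalog", None),
    ("POST", "https://shop.example.com/cart/add", {"item": 42, "qty": 1}),
]

def replay_once(session: requests.Session) -> None:
    """Replay the recorded requests in order, as one simulated user."""
    for method, url, payload in RECORDED_SCENARIO:
        response = session.request(method, url, data=payload)
        print(method, url, response.status_code)

if __name__ == "__main__":
    replay_once(requests.Session())
```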
I do wish I had had that feature back when I was doing this, because the convenience factor would have been VERY nice. Recording a load test demands a lot of accuracy and repeatability. In the past, if I needed to re-record a load test, I had to remember EXACTLY what steps I executed and make sure I did them the same way EVERY time if I wanted to compare one set of load tests to another. If the set of recorded traffic changes between load tests, you can't necessarily compare the results 1 to 1. By having an automated script in TestComplete to build my load test, I could guarantee that EVERY time I needed to record the load test, the same traffic would be recorded.
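For what it's worth, the "same steps every time" part is really just data-driven scripting: drive the recording run from a fixed data file instead of from memory. A minimal sketch, assuming a hypothetical steps.csv; in a real TestComplete script the loop body would perform the UI actions rather than just log them.

```python
# Minimal sketch, not actual TestComplete code: the point is only that the
# steps come from a fixed data file, so every recording run is identical.
import csv

def run_recording_steps(steps_file: str) -> None:
    """Execute the scripted steps in the exact same order on every run."""
    with open(steps_file, newline="") as f:
        for step in csv.DictReader(f):  # e.g. columns: action, target, value
            # A real script would click/type in the UI here; we just log the
            # step to keep the sketch self-contained.
            print(step["action"], step["target"], step.get("value", ""))

if __name__ == "__main__":
    run_recording_steps("steps.csv")  # hypothetical data file
```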
AlexKaras, tristaanogre, thank you both for sharing your thoughts on this.
I am pretty new to LoadComplete, and I am particularly interested in transforming data-driven TestComplete tests into LoadComplete tests. Do I need to create a link between each recorded request and the data store, or is this recorded by LoadComplete itself (during TestComplete's test playback)?
- AlexKaras · 8 years ago · Champion Level 3
Mathijs,
There is a difference between (automated end-to-end) functional tests and load tests.
Quite often, functional tests are more or less sophisticated. They usually do a lot of verifications, and the flow of a given test may change depending on the test data. (For example, the flow to purchase some general-purpose medicine may differ from the flow when a drug-containing one is requested. In the test, the implementation may be done via an 'if' switch, and the test code may branch appropriately.)
On the contrary, in order to be able to generate as significant a load as possible, load tests must be as simple as possible. This means that, while it may be technically possible, the implementation of a load test must avoid complex verifications, a large number of verifications, etc.
So usually a load test just fires a request and checks that the server responded with the expected code. Verification that the server responded with the correct expected data is usually not performed and is left to functional testing. As an example, it is fine for a load test if the server responds with an items list that does not correspond to the requested filter condition. The goal here is to check that the server was able to process the given number of requests for the filtered items list and respond with some data. Obviously, a functional test must fail if the returned list does not correspond to the requested filter.
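To illustrate the difference in verification depth, here is a rough sketch; the endpoint, parameters, and field names are invented for the example and are not TestComplete or LoadComplete API calls.

```python
# Sketch of the verification-depth difference described above.
# Endpoints, parameters, and fields are made up for illustration.
import requests

def load_style_check(session: requests.Session) -> bool:
    """Load test: fire the request and only check that the server answered OK."""
    response = session.get("https://shop.example.com/items",
                           params={"filter": "antibiotics"})
    return response.status_code == 200  # any returned data is acceptable here

def functional_style_check(session: requests.Session) -> bool:
    """Functional test: also verify the returned data matches the filter."""
    response = session.get("https://shop.example.com/items",
                           params={"filter": "antibiotics"})
    if response.status_code != 200:
        return False
    items = response.json()
    return all(item.get("category") == "antibiotics" for item in items)
```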
That is why it is better to create several load scenarios for the flows that depend on test data than to try to incorporate business logic into a load scenario and branch its flow based on the test data used.
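In other words, instead of one scenario with an 'if' inside it, you keep one flat request list per data-dependent flow, roughly like this (the flows and endpoints are invented for illustration):

```python
# One flat request list per data-dependent flow, instead of branching
# inside a single scenario. Flows and endpoints are invented examples.
LOAD_SCENARIOS = {
    "purchase_general_medicine": [
        ("GET",  "/catalog?type=general"),
        ("POST", "/cart/add"),
        ("POST", "/checkout"),
    ],
    "purchase_drug_containing": [
        ("GET",  "/catalog?type=prescription"),
        ("POST", "/cart/add"),
        ("POST", "/prescription/verify"),  # the extra step only this flow needs
        ("POST", "/checkout"),
    ],
}
```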
Another difference:
Functional UI tests usually depend on the UI design. This means that you may need to correct your test if, say, an option button control is replaced with a combo box.
Because load tests just replay the traffic between the client and the server, they do not care about UI changes as long as those changes do not affect the requests (their number, their order, and the data they carry) and the expected server responses.
This means that your load test may require no corrections if an option button control is replaced with a combo box.
But at the same time, it is quite possible that some internal change made by a front-end developer in the script code executed in the browser on the client side will change the number and content of some request(s) sent by the browser. And this can happen without any impact on the UI.
The bad news here is that I have talked to several front-end developers, and they said they are not aware of whether or not their changes result in traffic changes, because such information is not provided by the frameworks they use. (And they are not really interested in it.) This means that you may potentially need to correct your load test for every new build of your application. And the worst thing is that I am not aware of any tools/means that let you know in advance whether or not some change in the UI results in a traffic change. (This was the reason for the request I mentioned earlier.)
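One rough, home-grown way to at least detect such a change after the fact is to export the recorded traffic from two builds (for example as HAR files from the browser's developer tools) and compare the request lists by number, order, and content. A minimal sketch, with invented file names:

```python
# Rough sketch: compare the requests recorded for two builds (e.g. exported
# as HAR files from the browser's dev tools) to spot traffic changes.
import json

def request_signature(har_path: str) -> list:
    """Reduce a HAR recording to the ordered list of (method, url, body)."""
    with open(har_path) as f:
        entries = json.load(f)["log"]["entries"]
    return [
        (e["request"]["method"],
         e["request"]["url"],
         e["request"].get("postData", {}).get("text", ""))
        for e in entries
    ]

if __name__ == "__main__":
    old = request_signature("build_41.har")  # hypothetical file names
    new = request_signature("build_42.har")
    if old != new:
        print("Traffic changed: the load scenario probably needs re-recording.")
```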
Considering the above, the approach with TestComplete looks promising because it: a) simplifies recording in LoadComplete; and b) ensures that the recorded traffic corresponds to the traffic actually generated by the application.
The drawback is that the complete scenario is overwritten and the data correlation has to be done anew, which may turn out to be a time-consuming and non-trivial task depending on the internal design and functional complexity of the tested application.
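For anyone new to the term: data correlation means finding the dynamic values in server responses (session IDs, tokens, generated record IDs) and substituting them into the later requests that reuse them. A simplified sketch; the endpoints and the "session_token" field are invented for illustration:

```python
# Simplified illustration of data correlation: a value returned by one
# response must be extracted and injected into subsequent requests.
# The endpoint names and the "session_token" field are invented.
import requests

def correlated_flow() -> None:
    session = requests.Session()
    login = session.post("https://shop.example.com/login",
                         data={"user": "demo", "pwd": "demo"})
    token = login.json()["session_token"]  # dynamic value to correlate

    # The recorded request contained an old token; replaying it verbatim
    # would fail, so the fresh token has to be substituted in.
    session.post("https://shop.example.com/cart/add",
                 data={"item": 42, "token": token})
```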
- tristaanogre · 8 years ago · Esteemed Contributor
What AlexKaras said... totally... two different kinds of testing requiring two different strategies, with two different sets of problems regarding maintenance, etc. The integration between the two, as mentioned, is simply a way of using TestComplete to record your load-testing traffic.
One other thing I'd add... load testing is usually done on a fairly stable code base. You don't want a lot of development, bug fixes, etc., going on while you're load testing. You also don't want errors occurring while you're load testing. You want a clean end-to-end flow through the scenario so that all the traffic you're recording is as close to "real world" as possible and you get accurate measurements of performance.
- AlexKaras · 8 years ago · Champion Level 3
> [...] load testing is usually done on a fairly stable code base.
On the other hand, you may consider it a good idea to execute load tests on some regular basis during development to monitor how functionality changes affect the performance of the server and back end.
To some extent this task can be done with code profiling (AQtime ;) )... but the actual implementation and support effort might depend on your given environment, so this is probably something that may be considered as well.
Otherwise, as tristaanogre Robert said... :)
- tristaanogre · 8 years ago · Esteemed Contributor
LoadComplete takes care of recording all of the traffic itself. There's no need to create specific links or anything.