I am trying to record and play back tests that create a new Lead record in our Microsoft CRM web application, and I'm getting a 400 Bad Request error on the POST request to InlineEditWebService.asmx. I found this article that goes into detail on how to fix this and similar issues in LoadRunner, but it doesn't translate well for me to LoadComplete: https://community.dynamics.com/crm/b/crmperformancetesting/archive/2018/01/03/dynamics-crm-365-perfo...
Has anyone else experienced issues with MS CRM when creating load tests in LoadComplete? I already sent a request to SmartBear for assistance through the tool but haven't heard back yet. We are on a tight timeline to get load tests created and run to ensure our application can handle an increase of 200 users in the next month. Any help is greatly appreciated!
Most probably, you need to carefully analyze the recorded and replayed traffic and adjust the automatic data correlation that LoadComplete made initially (as the article says). It is difficult to say anything more definite without seeing your test project and test log.
> I already sent a request to Smartbear for assistance through the tool but haven't heard back yet.
How did you do this?
For the initial problem submission you must use this form: https://support.smartbear.com/message/?prod=LoadComplete
I submitted a request through the tool via the prompt that displays when test errors/warnings occur. I searched Google for some LoadComplete articles on modifying auto-correlation, so I will start there and post back if I find anything out. Thanks for your response @AlexKaras!
Data correlation in LoadComplete is described here:
(And I would really recommend familiarizing yourself with the documentation; it is excellent and contains a lot of useful information.)
If you like, I can take a look at your project and try to help, but obviously I cannot promise anything definite.
The basic scheme is to Validate your recorded scenario. This executes it for one user and stores all data in the test log. Then start from the first failed request, compare the recorded and played-back data, and figure out which difference caused the request to fail. Correct the request accordingly (a missing cookie, data from a previous request, etc.) until it passes. Then move on to the next failed request, and so on.
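The comparison step above can be done by eye in the test log, but if you export the recorded and replayed requests as text, a plain diff makes the mismatches jump out. A minimal sketch (the request contents and cookie names below are made up for illustration, not taken from your project):

```python
import difflib

# Hypothetical recorded vs. replayed request texts exported from the log.
recorded = [
    "POST /InlineEditWebService.asmx HTTP/1.1",
    "Cookie: ReqClientId=abc123; orgId=42",
    "",
    "<soap:Envelope>...</soap:Envelope>",
]
replayed = [
    "POST /InlineEditWebService.asmx HTTP/1.1",
    "Cookie: ReqClientId=abc123",      # one cookie was not replayed
    "",
    "<soap:Envelope>...</soap:Env",    # body was truncated
]

# Lines starting with "-" exist only in the recording, "+" only in the replay.
diff = list(difflib.unified_diff(recorded, replayed,
                                 fromfile="recorded", tofile="replayed",
                                 lineterm=""))
print("\n".join(diff))
```

Any `-`/`+` pair in the output is a candidate for a missing or wrong correlation.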
@AlexKaras - I figured out that I am getting a 400 Bad Request because the simulated SOAP request is getting cut off prematurely, so the closing tags are not present and the server returns an "unclosed token" error message (see screenshots). Do you think this could be related to auto-correlation, or to needing to add correlations manually? I looked through the auto-correlations that were created and there weren't any for this specific web service, so I'm not sure where to go from here. We have reached out to our sales engineer to see if we could have a working session, but I'm stumped.
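For what it's worth, the "unclosed token" message is exactly what an XML parser reports when a SOAP body is cut off mid-element, so you can confirm truncation independently of the server. A small sketch using Python's standard-library parser (the request body below is invented to mimic the symptom):

```python
import xml.etree.ElementTree as ET

def is_well_formed(body: str) -> bool:
    """Return True if the request body parses as complete XML."""
    try:
        ET.fromstring(body)
        return True
    except ET.ParseError:
        # A truncated body typically fails with "unclosed token".
        return False

# Hypothetical truncated SOAP body: the CreateLead element is cut off.
truncated = "<Envelope><Body><CreateLead"
complete = "<Envelope><Body></Body></Envelope>"

print(is_well_formed(truncated))  # False
print(is_well_formed(complete))   # True
```

If the body copied out of the simulation log fails this check, the truncation happened on the client side before the request was sent, not in the server's handling of it.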
Well... I hope that the assistance from Support will be the most useful.
Unfortunately, I am afraid that I cannot say a lot regarding those screenshots... At the moment, all I can say is:
-- I don't think the truncated request body is caused by data correlation. My opinion is that it was either an intermittent network problem during replay or some problem with LoadComplete itself. Hopefully, Support will be able to answer this;
-- I take it that the previous requests were replayed successfully (i.e. with the same return code as recorded and with replayed data that logically matches the recorded data);
-- It is strange that two cookies in the request were missing during replay. Most probably they need manual correlation. To do this, search all previous responses for the Set-Cookie headers that set these cookies, create proper Data Selectors to extract their values, and add Data Replacers to request #36;
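To make the Selector/Replacer idea concrete, here is a rough analogue in plain Python of what that manual correlation does: find the value a server assigned via Set-Cookie, then substitute it into a later request in place of the recorded value. The response headers and cookie names are hypothetical, not taken from your log:

```python
import re

# Hypothetical recorded responses (headers only).
responses = [
    "HTTP/1.1 200 OK\r\nSet-Cookie: ReqClientId=abc123; path=/\r\n",
    "HTTP/1.1 200 OK\r\nSet-Cookie: orgId=42; path=/; HttpOnly\r\n",
]

def select_cookie(name, responses):
    """Data Selector analogue: find the value the server set for `name`."""
    pattern = re.compile(rf"Set-Cookie:\s*{re.escape(name)}=([^;\r\n]+)")
    for resp in responses:
        m = pattern.search(resp)
        if m:
            return m.group(1)
    return None

def replace_cookie(request, name, value):
    """Data Replacer analogue: swap the recorded value for the live one."""
    return re.sub(rf"({re.escape(name)}=)[^;\r\n]+", rf"\g<1>{value}", request)

request36 = "POST /InlineEditWebService.asmx HTTP/1.1\r\nCookie: orgId=RECORDED\r\n"
request36 = replace_cookie(request36, "orgId", select_cookie("orgId", responses))
print(request36)  # Cookie header now carries orgId=42
```

In LoadComplete you configure the same search-then-substitute pair through the UI rather than writing code, but the data flow is identical: the Selector reads the dynamic value out of an earlier response, and the Replacer writes it into the later request.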
-- Why does request #36 have collapsed child elements? What are they? I don't remember ever seeing this, so I'm just wondering;
-- It looks suspicious that the same tokens (the second and third) were used during replay. I'm not sure that this is correct;
-- At the same time, it is strange that two extra cookies were added during the replay. I'm not sure that this is correct either.
I think you should carefully go through the simulation log, compare the recorded data with the simulated data, and adjust the requests so that, at a minimum, no extra data is added to the simulation compared with what was recorded.
Considering the article you referenced in your initial message, it will unfortunately require considerable effort to create a correct simulation.