swagger-codegen-cli-v3 Python response_type='Object' possible bug

I'm using the swaggerapi/swagger-codegen-cli-v3 Docker image to generate Python client code from a simple live API. However, when I use the DefaultApi class in api/default_api.py, I get deserialization errors for response_type='Object'. I resolved (or worked around) these errors by changing the generated code after the fact with this shell command:

    # fix what appears to be a bug in the SDK portion of the generated code
    sed -i.orig -e "s/response_type='Object'/response_type=object/" ./out/python/api/default_api.py

By "SDK portion" in the comment I mean the generated client class that hides the HTTP response details, as described at https://github.com/swagger-api/swagger-codegen. This problem does not occur with api_client.py/ApiClient, which is one level lower than api/default_api.py/DefaultApi. With the generated code changed to read response_type=object, the DefaultApi class returns the deserialized data from a GET unchanged, which is what I wanted for my client script. It no longer trips over 'Object', which is not a Python type anyway; that is why I think this may be a code generation bug.

Re: Scripts library redefine

Solved via two other posts:
https://community.smartbear.com/t5/SoapUI-Pro/Setting-script-library-path-for-command-line-execution/m-p/136007#M30863
which pointed to:
https://community.smartbear.com/t5/SoapUI-Pro/script-lib-location-into-testrunner-bat-file/td-p/23369

SmartBear Support provided the answer in the second post above:

    -Dsoapui.scripting.library=<your script library path>

Adding that to the arguments to testrunner.bat called from my Gradle build file worked. Note that the path should be an absolute path: a path relative to the directory from which testrunner.bat was invoked did not work, whereas an absolute path did. Case closed (for me, anyway).

Re: Scripts library redefine

I echo Kiril's question.
Specifically, how can I set the Script Library directory when calling the SoapUI test runner (testrunner.bat)? I don't have the option of setting it via the ReadyAPI IDE. The alternative to redefining the Script Library directory is to copy our own scripts into the ReadyAPI bin/scripts folder, which then becomes a maintenance issue: those scripts would have to be cleaned up at the end of the run without deleting any of the SmartBear-provided script files there. So it is best to set the Script Library on each call to testrunner.bat. How can this be done?

Re: Removing breakPoint elements from test case XML files

Hi nmrao. Fortunately, test execution via testrunner.bat is not affected by breakpoints. The problem to be solved is simply that the accumulated breakpoints take up more and more space in the test XML files - sometimes there are more lines for breakpoints than for anything else in the test! I am just looking to confirm that it is okay to remove them from the XML files (via scripts), rather than somehow removing them via the ReadyAPI IDE. (On that point, it looks like <breakPoint> XML elements remain in the XML file even after the breakpoints are removed in the IDE.)

Re: Removing breakPoint elements from test case XML files

Hi richie. Indeed, I am considering writing a script to load the test case XML files and write them back to disk with the <breakPoints> elements removed - basically automating the effect of editing the XML file in Notepad++, deleting the <breakPoints> elements and saving the file, but for hundreds of files rather than a few. You are right to warn about the character set: I've had issues with scripts loading and saving test case XML files in the past, where I neglected to load as UTF-8 and got strange characters when saving back to disk. Thanks for the reminder! I will try doing this at some point, though I'm not sure how soon (busy with other things), and I will remember to explicitly specify the charset when saving and loading.
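For what it's worth, the bulk edit described above could be sketched in Python. This is a minimal sketch under stated assumptions: the elements to strip are named con:breakPoints, the files are UTF-8, and the "tests" directory is a hypothetical project layout - adjust both to match your project.

```python
import re
from pathlib import Path

# Matches a complete <con:breakPoints> ... </con:breakPoints> element,
# including self-closing forms such as <con:breakPoints/>.
BREAKPOINTS_RE = re.compile(
    r"<con:breakPoints\b[^>]*/>|<con:breakPoints\b[^>]*>.*?</con:breakPoints>",
    re.DOTALL,
)

def strip_breakpoints(xml_text: str) -> str:
    """Return the XML text with all con:breakPoints elements removed."""
    return BREAKPOINTS_RE.sub("", xml_text)

def strip_breakpoints_in_place(path: Path) -> None:
    # Read and write explicitly as UTF-8 so non-ASCII characters survive,
    # per the charset warning above.
    text = path.read_text(encoding="utf-8")
    path.write_text(strip_breakpoints(text), encoding="utf-8")

if __name__ == "__main__":
    # "tests" is an assumed location for the project's test case XML files.
    for xml_file in Path("tests").rglob("*.xml"):
        strip_breakpoints_in_place(xml_file)
```

A regex keeps the rest of each file byte-for-byte identical (a full XML parser would re-serialize and could reorder attributes or change whitespace), which makes the change easy to review in version control.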
Thanks, Rob

Removing breakPoint elements from test case XML files

We have a lot of SoapUI test cases that have accumulated many <breakPoints> elements in their test case XML files as different users have debugged through the test cases on various occasions over many months. Some test cases have hundreds of such breakpoints, which are now irrelevant. Is it safe to simply remove the <con:breakPoints>...</con:breakPoints> elements from the test case XML files? This can be done in a bulk operation over large numbers of tests, much more efficiently than having individuals edit the tests in ReadyAPI and remove the breakpoints manually. I'd appreciate any insights on this. Thanks, Rob

Unconditionally executing cleanup steps in failing tests

Test cases have Teardown scripts that are executed unconditionally, even if the test fails and stops part way through. This is useful for ensuring that tests free up any resources they allocated earlier, before the failure. However, it encourages all teardown activities to be coded in Groovy, which obscures the symmetry between resource setup in SoapUI test steps and resource teardown in Groovy, and makes it harder to automatically check (via scripts) that setup and cleanup are symmetric.

Desired feature: a way to identify some "cleanup" sequence of steps in each test that would execute unconditionally even if the test stops midway because of a failure. This would be akin to the "finally" in Java try-catch-finally. The analogy could be taken further: if a step failure or other error occurred during the cleanup steps, the Teardown script would itself still execute unconditionally, just as it does today.

Benefit: resource cleanup actions could be hoisted out of the Teardown Groovy script and expressed in the test case itself as real test steps that can be seen in the IDE. This makes it easier for automation developers to code cleanup steps and verify they are present (as opposed to Teardown scripts, which tend to be hidden and forgotten).
It also makes setup/cleanup actions more amenable to analysis by scripts, for the purpose of enforcing verification rules and generating metrics. Thanks.../rob

Re: failOnError versus failTestCaseOnErrors

Thanks for the link, though I find the explanation on that page a bit confusing: https://support.smartbear.com/readyapi/docs/soapui/ui/case.html

Abort on Error: stops the test run when a test step fails.
Fail Test Case on Error: marks the test case as failed if a test step fails. Note: this option does not stop the test run.

Although Abort on Error immediately stops execution of a specific test case upon an error, I know that SoapUI will continue executing subsequent test cases (i.e. that setting doesn't stop the overall project run). In contrast, it seems "Fail Test Case on Error" just means that the test case will be marked as failed because a test step failed, but it keeps executing steps after the failed one. So it seems that Fail refers to the final status of the test case, whereas Abort refers to whether it continues executing. What would happen if Abort on Error were TRUE but Fail Test Case on Error were FALSE? If a test step failed, what status would the test case have once it aborted?

failOnError versus failTestCaseOnErrors

SoapUI Pro 1.8.0 has these two TestCase Options, which sound like they do the same thing:

Abort on Error: "Abort test if an error occurs"
Fail Test Case on Error: "Fail TestCase if it has failedTestSteps"

I assume the former aborts the test case if some unexpected internal error occurs (which invalidates the test and for which no assertions would have been defined), whereas the latter fails the test case if one of the test steps fails in a normal way (e.g. an assertion failure). Is that the case?

Related question: what is the difference between a test that Aborts and a test that Fails? Is the distinction only detectable at the level of Groovy code inspecting the status of some object? E.g.
WsdlTestCaseRunner has methods isCanceled and isFailed, but no mention of isAborted. Please clarify. Thanks.

Re: Option to gracefully terminate and fail SoapUI project upon first test failure

Already did. Support told me to request a new feature and sent me here. Thanks.../r