Re: Test Summary under Plan cycle
Hello skakkar,
Thank you for participating in our SmartBear community. The Test Summary page doesn't offer much customization; it is better used as a hub for selecting the different labels or components that you are testing. We recommend using Jira dashboard gadgets, which provide custom ways to view pass/fail executions (e.g., chart, table, etc.): https://zephyrdocs.atlassian.net/wiki/spaces/ZFJCLOUD/pages/21757976/Test+Metrics+Gadgets
Hope this helps!

Re: Zephyr for JIRA and TestComplete
In addition to hkim5's response, which is correct, the integration between Zephyr for Jira and TestComplete is native to TestComplete. You can link a TestComplete test to a Zephyr for Jira test execution so that wherever it is executed (local machine, Jenkins, etc.), the results will feed into Jira automatically. https://support.smartbear.com/testcomplete/docs/working-with/integration/zephyr/index.html

Re: REST api
Thank you for posting to our Community Forum. Please submit this issue to support along with more information on what is incorrect in the spec and the project XML file that you are using in ReadyAPI, so that our team can have a closer look. Support ticket link: https://support.smartbear.com/message/?prod=ReadyAPI
Have a great day!

Re: Export GET response to CSV
Thank you for posting to our Community Forum. To output the GET response information to a CSV file, we recommend using the Data Sink test step within the TestCase structure. You can create a "Property" that pulls from the API response (either a part or the whole) and export it in multiple formats, including CSV. Documentation: https://support.smartbear.com/readyapi/docs/soapui/steps/data-sink.html
Have a great day!

Re: load test with 30 threads for 60 seconds with a Test case/input file that contains 10 records.
Thank you for posting to our Community Forum. The number of calls to a service in the indicated load test scenario depends on a couple of things:
- The average response time of each web service call in the Test Case
- The configuration of the Data Source under the "Load Test" option
To expand on the second point, a Data Source can be used in two different ways: each row of a file can be allocated to a single virtual user (this is known as a shared Data Source), or each virtual user iterates through every row. The second option would potentially lead to more API calls within the load test duration. Here is our documentation on the Data Source options: https://support.smartbear.com/readyapi/docs/soapui/steps/data-source.html#options
Sorry that I cannot provide a more precise answer with the information provided. Hopefully it gives you a good start for your analysis; the rough sketch after this reply shows how those two factors interact.
Have a great day!
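To make that dependency concrete, here is a rough back-of-the-envelope sketch in Python. The 0.5-second average response time, the single pass through the data file, and the one-call-per-row model are illustrative assumptions only, not measurements from any real project and not a description of ReadyAPI's exact scheduling or Data Source loop settings.

```python
# Back-of-the-envelope estimate of API calls in a fixed-duration load test.
# Assumptions (hypothetical, adjust to your project): one API call per data row,
# a single pass through the data file, and no think time between calls.

def estimate_calls(virtual_users: int, duration_s: float, avg_response_s: float,
                   rows: int, shared_data_source: bool) -> int:
    # Calls a single virtual user could finish in the test window if data were unlimited.
    max_calls_per_vu = int(duration_s // avg_response_s)

    if shared_data_source:
        # Shared Data Source: the rows are split across all virtual users,
        # so one pass through the file yields at most `rows` calls in total.
        return min(rows, virtual_users * max_calls_per_vu)

    # Non-shared: every virtual user walks through all rows itself,
    # which is why this mode can generate far more calls.
    return virtual_users * min(rows, max_calls_per_vu)

# Example with the numbers from the question: 30 virtual users, 60 seconds,
# a 10-row file, and a guessed 0.5 s average response time.
print(estimate_calls(30, 60, 0.5, 10, shared_data_source=True))   # -> 10
print(estimate_calls(30, 60, 0.5, 10, shared_data_source=False))  # -> 300
```

The point of the sketch is only that the data-source mode, not the thread count alone, usually dominates the total call count; measure your own average response time and check your Data Source options before relying on any numbers.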
Re: How to terminate the test when reaching the request sample to some 1000 request in Load UI Pro?
Thank you for posting to our Community Forum. You can add a "Test Case Run" assertion to the load scenario with the "Stop Test on Failure" option checked, so that the load test terminates if the TestCase runs more than 1000 times. LoadUI documentation on assertions: https://support.smartbear.com/readyapi/docs/loadui/configure/assertions/about.html
Have a great day!

Re: ReadyAPI 2.3.0 integration with GIT (GIT plugin V1.0.2) gives error.
Thank you for posting to our Community Forum. You need to update the Git plugin to the latest version, 1.2.0. We currently have an issue with the automated plugin update button in our Plugins Manager, so I recommend the following:
1. Uninstall Git Plugin 1.0.2.
2. Select "Load Plugin from File".
3. Select ready-api-git-plugin-1.2.0.jar in C:\Users\<UserName>\.soapui\plugins.
4. Restart ReadyAPI.
Have a great day!

Re: <BUG?> Check for existence JSON Node no longer works as expected in 2.3.0
Hey lgermain315,
There was an issue with the recent deployment of the maintenance build that didn't include a fix for when the JSON element was null. That issue has been rectified, and I confirmed it this morning (EDT). Please download the maintenance build once again: https://support.smartbear.com/downloads/readyapi/maintenance
Hey JoostDG and anyone else who stops by this post,
Our development team puts some bug fixes into our maintenance build so that we can get the resolution out to customers quickly. The fix will then be included in the next release of ReadyAPI (2.4). For the time being, please continue using 2.3.0-m and keep an eye out for when 2.4 is released. (The short sketch at the end of this page illustrates the null-versus-missing distinction that this fix is about.)
Have a great day!

Re: No valid SoapUI NG license exists during bamboo integration
Thank you for posting to our Community Forum. Please go to the Bamboo service in Windows Services and set the service to sign in with the same user account on which you activated the ReadyAPI license. If you still have further issues, please submit a ticket here so our team can have a look: https://support.smartbear.com/message/?prod=ReadyAPI
Have a great day!
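For readers curious what "when the JSON element was null" means in the existence-check post above, here is a minimal, generic sketch of the difference between a node that is present with a null value and one that is missing entirely. This is plain Python for illustration only, not ReadyAPI's assertion code; the node_exists helper and the sample documents are hypothetical.

```python
# Illustrates the "present but null" vs. "missing" distinction behind the
# JSON existence-check fix discussed above.
import json

with_null = json.loads('{"order": {"discount": null}}')  # node present, value is null
without   = json.loads('{"order": {}}')                  # node missing entirely

def node_exists(doc, *path):
    # A node "exists" if every key on the path is present,
    # even when the final value is null (None in Python).
    current = doc
    for key in path:
        if not isinstance(current, dict) or key not in current:
            return False
        current = current[key]
    return True

print(node_exists(with_null, "order", "discount"))  # True  - key present, value null
print(node_exists(without, "order", "discount"))    # False - key missing entirely
print(with_null["order"]["discount"] is None)       # True  - null is not the same as absent
```

An existence check that only inspects the value would wrongly report the first document as "missing" the node; treating the two cases separately is the behavior the maintenance-build fix restores.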