How do you organize your endpoints & request tests?
Our REST resources keep growing & growing. When an endpoint had 20 requests it was easy to build & navigate; as new requests are added, the list is getting unruly & harder to manage.
Some feature suggestions:
- On Projects, search the endpoint for requests ("put" or "auth"). This will filter down to the desired requests, however I can do nothing with the results. I would like to be able to act on them from the context menu, e.g. Add to Test Case.
- On Projects, the ability to sort.
- On Projects, better ordering & the option for nesting - see Postman.
I don't think I'm doing anything special in any way.
- Test Suites (6)
-- Test Cases (60+)
--- Tests (100s)
I'm currently building Story, End-to-End & Regression tests. Each time the definition is updated I get a few more REST services (64 to date) to build tests for. This can be a few story tests, which are then duplicated into regression &/or E2E.
The issue is manageability. The product does not clearly indicate/highlight newly added REST services; at least prior to 3.0 they would be listed at the bottom, so now I have to hunt for them.
My tree view is pretty wild now; it just seems there should be a better process.
Do you mean that development of the REST services is still in progress, and you get new services or schema updates each time, which forces you to redo the building of requests (or update them)?
This is a typical issue in almost all projects where development and test automation proceed in parallel, especially from scratch.
Certain things go as designed and some don't, so those have to be reworked to make them work.
IMO, it would be a good time to start automation at least once the WADL/Swagger definition is fixed/concrete. Otherwise you will end up re-doing work. Also, people should realize that it is not only test automation that needs rework; the development team is re-working the particular service too. So I'm afraid there is no way of avoiding it.
Now, the only thing to focus on is how to minimize the effort needed to handle service updates in the test automation.
And there is no pre-defined / standard way to adopt this. Maybe people can come up with tools and ideas over time and with experience. It can depend on different parameters such as:
- how the tests are designed
- does the new update require additional properties (in the request)
-- mandatory or optional
- does the new update remove/rename existing properties
- how many requests' tests are impacted
- if working closely with the development team, analyze definition changes before the services are implemented
- and get quick feedback on the impact
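To make that analysis concrete, here is a minimal sketch of such a check in Python. The `old`/`new` Swagger-style fragments and the `Order` model are purely hypothetical, assuming Swagger 2.0-style `definitions` with `properties` and `required` lists; the idea is just to flag added/removed properties and newly mandatory ones between two definition versions:

```python
# Sketch: diff request-body properties between two Swagger/OpenAPI
# definition versions. The fragments below are hypothetical examples.

old = {
    "definitions": {
        "Order": {
            "required": ["id"],
            "properties": {"id": {"type": "string"},
                           "qty": {"type": "integer"}},
        }
    }
}

new = {
    "definitions": {
        "Order": {
            "required": ["id", "customer"],
            "properties": {"id": {"type": "string"},
                           "customer": {"type": "string"}},
        }
    }
}

def diff_model(old_spec, new_spec, name):
    """Report property-level changes for one model definition."""
    o = old_spec["definitions"][name]
    n = new_spec["definitions"][name]
    o_props, n_props = set(o["properties"]), set(n["properties"])
    return {
        "added": sorted(n_props - o_props),
        "removed": sorted(o_props - n_props),
        "now_mandatory": sorted(set(n.get("required", []))
                                - set(o.get("required", []))),
    }

print(diff_model(old, new, "Order"))
# → {'added': ['customer'], 'removed': ['qty'], 'now_mandatory': ['customer']}
```

A report like this answers the "mandatory or optional" and "removed/renamed" questions above in seconds instead of a manual comparison.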
Overall, if one has good programming skills, one can explore whether automatic update of the requests in the project is possible (only if it is faster than a manual update).
It could also be explored whether the project itself can be generated automatically just by inputting the swagger/WADL file.
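In the same spirit, a small script could compare yesterday's definition with today's to surface newly added services, instead of hunting through the tree view. A minimal sketch, assuming Swagger 2.0-style JSON (the paths below are hypothetical):

```python
# Sketch: list operations (method + path) present in a new Swagger
# definition but absent from the previous one. Paths are hypothetical.

def operations(spec):
    """Flatten a Swagger 2.0 `paths` object into 'METHOD /path' strings."""
    ops = set()
    for path, methods in spec.get("paths", {}).items():
        for method in methods:
            ops.add(f"{method.upper()} {path}")
    return ops

yesterday = {"paths": {"/orders": {"get": {}, "post": {}}}}
today = {"paths": {"/orders": {"get": {}, "post": {}, "put": {}},
                   "/customers": {"get": {}}}}

new_ops = sorted(operations(today) - operations(yesterday))
print(new_ops)
# → ['GET /customers', 'PUT /orders']
```

In practice the two dicts would come from `json.load()` on the saved definition files, and the output is exactly the "what's new today" list that is otherwise hunted for in the tree.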
"it would be good time to start for automation at least the wadl/swagger definition is fixed/concrete"
I can't wait on anything; I get new requests to test daily. A request is concrete & complete at that moment, but since it's a totally new API service there are cases where parameters and request bodies change due to ongoing dev.