One of the biggest issues we face is keeping data in good states across a multitude of databases. Reset scripts, or completely nuking everything and setting it back up, aren't really an option, and there are multiple applications hitting all of these databases.
Would love to hear others' thoughts on ways to alleviate this.
Also, would love to know how others utilize one application's APIs to generate data for testing another application's UI.
cunderw wrote:
One of the biggest issues we face is keeping data in good states across a multitude of databases.
My usual preference is to generate semi-random, uniquely identifiable data and use it as test input.
An example of such a value might look like this:
Invoice20190221_1500_jwofiSW9rjdf0d239r
where
-- Invoice - the primary item-type identification prefix;
-- 20190221_1500 - a date/time stamp that makes the value unique;
-- jwofiSW9rjdf0d239r - a random string of random length that adds further variation.
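For illustration, here is a minimal Python sketch of that scheme; the function name make_test_id and the length bounds are my own assumptions, not from the post:

```python
import random
import string
from datetime import datetime

def make_test_id(prefix: str, min_len: int = 8, max_len: int = 24) -> str:
    """Build a semi-random, uniquely identifiable test value in the form
    <prefix><YYYYMMDD_HHMM>_<random suffix of random length>."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M")      # e.g. 20190221_1500
    suffix_len = random.randint(min_len, max_len)       # random length adds variation
    alphabet = string.ascii_letters + string.digits
    suffix = "".join(random.choices(alphabet, k=suffix_len))
    return f"{prefix}{stamp}_{suffix}"

# Example usage:
print(make_test_id("Invoice"))  # e.g. Invoice20190221_1500_jwofiSW9rjdf0d239r
```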
In almost all cases, test data like this can be used without significant risk of unexpected failures caused by data duplication, and it makes it quick and easy to work out when and where a given record was entered into the application.
As a side benefit, if you run such tests against the same database for a long time, you end up with a relatively large database that can be used to measure the application's performance when working with large data volumes.