Forum Discussion
I see you keep editing your post... but I'm not sure what the problem is. Object Spy/Object Property Viewer are not directly related to being able to use data from one Keyword Test in another (it's been quite a number of years since I've played with TestComplete 7 but I'm sure Object Spy is what you're looking for with regards to investigating an object).
If the answer I gave was unhelpful, let me know what you still need and I'll try to help.
Thanks a lot for your input so far.
I've edited my post several times as I'm trying to describe my issue as clearly as possible (hasn't been very successful so far ;)
1. I start recording a test, Test A. During this test an asset value is created and I checkpoint this value and keep on recording.
2. I keep recording Test A and at some point need to insert the value from above in a text box - and I would like my test to do this automatically when I re-run Test A.
This is because the value of the asset created in test step 1 automatically goes from, e.g., 1 to 2 to 3 to 4 every time this test step is re-run.
3. Another issue: during the creation of this asset, a description of the asset must be entered in another text box - this could be, e.g., abc1000 or anything else.
As this description DOES NOT automatically increase from abc1000 to abc1001 to abc1002 every time the test is re-run, I need a way to make this happen, if possible.
I've considered options such as a cleverly created variable, a test parameter or a data loop in which the description increments are loaded from an Excel sheet.
As I admittedly am a TestComplete newbie, any test "recipes", as detailed as possible, are warmly welcomed.
I can upload screen shots if my problem description still appears unclear.
Have a nice weekend!
Best regards,
//Camilla
- tristaanogre · 6 years ago · Esteemed Contributor
So, you're not passing values from one test to another; basically, within the same test, you want to take a value that you "collect" at one point in the test and use it at another.
Again... Set Variable operation is how you're going to do this.
So, based upon your steps, you will need to do this stuff AFTER you complete the recording, or, when you get to these points in your recording, pause your recording and add the necessary operations and edits.
1) When the asset value is created as per your step one, use Set Variable Value operation to set a variable to the asset value as it is created. In anticipation of item 3 below, you would also use Set Variable Value to set ANOTHER variable as the description of the asset.
2) When you get to the point where you need to insert the value in a text box, whether you use the Keys method or the SetText method to enter the data, there is a parameter on the method whose mode you can set to "Variable", at which point you select the variable defined in item 1 above as the source of the data.
3) As with item 2, again, you would use the Variable mode of the Keys/SetText method parameter and select the second variable as your entry into the text box.
There is no need for a test parameter, data-driven loop, or anything particularly clever. Think of it the same way you would if you were testing manually. You create the asset and, on a piece of paper, jot down the asset value and the asset description. Then, when you get to the relevant points of your test, you reference your piece of paper and use those values to enter the data into the application. Variables are your "piece of paper" in this situation: you note your values and then use them later.
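The "piece of paper" idea can be sketched in plain JavaScript. This is only an illustration of the store-then-reuse pattern, not TestComplete's actual API (in a Keyword Test, the "Set Variable Value" operation and Variable-mode parameters do the equivalent); the names `assetValue` and `assetDescription` are made up for the example.

```javascript
// A minimal sketch of noting values once and reading them back later.
// In a Keyword Test this is what "Set Variable Value" and Variable-mode
// parameters do; the object below stands in for the test's variables.
const notedValues = {};

function noteValue(name, value) {
  notedValues[name] = value;          // jot the value down when it appears
}

function recallValue(name) {
  return notedValues[name];           // read it back at a later test step
}

// During asset creation (example values):
noteValue('assetValue', '4');
noteValue('assetDescription', 'abc1000');

// Later, when filling in the text boxes:
const description = recallValue('assetDescription');
```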
- cmbdnne · 6 years ago · Occasional Contributor
Hi again,
Thanks for taking the time to help finding a solution to my problems.
However, I find that I might not have described my problem adequately - or I'm too much of a TestComplete newbie to be able to implement your clever suggestions. So, in the hope that you won't roll your eyes too much, I'll give it one more try:
Overall, I use TestComplete to document a series of procedures in MS Navision, most of which involve creating assets that are automatically assigned unique ID numbers and have a range of other information added to them.
So, I'm recording a test in which an asset is created, and in this process I need to enter a short piece of text in a text box that describes the asset, in this case abc1 (filling out this box is mandatory for moving on with the testing procedure). Once the asset has been created, the test involves adding a lot of other information to the asset and is concluded by posting the asset. So, this test is rather long, but the recording isn't the problem.
My problem arises when I re-run my test after completing the recording: almost at the end of the test, a message box pops up warning me that an asset with the aforementioned asset description, abc1, already exists!
In order to continue the re-run, I have to close this message box manually by clicking "Ok", whereupon the test continues - but the issue isn't flagged red in my log, which reports that the test was re-run without any errors or warnings.
My best guess would be that MS Navision, when re-running the test, creates a new asset with a new unique ID but with the same asset description.
Entering the asset description appears as a "SetText" step in my recorded test. Maybe the value can't be defined by clicking the ellipsis?
So: Is it in any way possible to make TestComplete assign the new asset another asset description, preferably abc2, when re-running the test, so that I can re-run it without intervening manually?
Thank you very much in advance!
- tristaanogre · 6 years ago · Esteemed Contributor
OK... totally different problem. My assumption earlier was that the asset description was something auto-generated by the application under test and that you needed to track it through the test case. Instead, this is a problem where the application under test has built-in checks to prevent creation of duplicate data records. So, each time through the test, you need to make sure that the asset description is unique.
So... there are several different ways this can be resolved, some more involved than others, some that may involve outside resources. It all depends somewhat on what will work best for you. Here are the high-level descriptions.
1) This is my preferred way, if possible. Your test environment should be treated as a "clean room" where you have control over all aspects of your environment, including the database against which your application is working. Before your test case is run, you "reset" your environment to clear out any conditions, data, information, etc., that may be artifacts from previous tests and experimentation. In the past, I've done this by scripting an SQL restore from backup so that, when my tests start, they start with completely reset data and no residual information. It doesn't need to be a 100% "clean" database, just one reset to a known starting point. As you've discovered, test automation is very sensitive to initial conditions at the start of the test. If data is already present when you create the test, the automation (unless you code it otherwise) will assume that the same data is present on each test run. This is a more involved solution and requires a level of access to your application under test, the data against which it runs, etc., that goes beyond a simple user: more of a sysadmin role. You can either do these things yourself or request that others do these resets for you on a scheduled basis.
2) Second best option, IMO, is to create your data during your test using some sort of unique numbering scheme. A good way of doing this is to use the current date and time to create a unique number that you can then put in that initial text box when you enter the asset description. For example, if you do something like this as a "Code Expression" in the SetText parameter, you will get a unique asset description every time your test is run:
'abc' + aqConvert.DateTimeToFormatStr(aqDateTime.Now(), '%Y%m%d%H%M%S')
At the time I'm writing this, that expression would enter the value abc20180523080725 into your description field, since I'm posting this on May 23rd, 2018, at 08:07:25 AM Eastern US time. Every time you run your test, a new number is generated, creating a unique asset description. The bonus is that you can see in your database a history of the test assets created. If you need to use that name at various points through your tests, use the "Set Variable Value" operation to set a variable equal to that code expression and then use that variable in the appropriate parameters throughout your test case.
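The same timestamp-naming scheme can be sketched in plain JavaScript, for anyone who wants to see the mechanics outside TestComplete (inside TestComplete you would use the `aqDateTime`/`aqConvert` expression above; the function name here is invented for the example):

```javascript
// Sketch of the timestamp-based unique-name scheme from option 2.
// Builds prefix + YYYYMMDDHHMMSS from a Date, mirroring the
// '%Y%m%d%H%M%S' format string used in the TestComplete expression.
function uniqueAssetDescription(prefix, now) {
  const pad = (n, width) => String(n).padStart(width, '0');
  return prefix
    + now.getFullYear()
    + pad(now.getMonth() + 1, 2)   // months are 0-based in JavaScript
    + pad(now.getDate(), 2)
    + pad(now.getHours(), 2)
    + pad(now.getMinutes(), 2)
    + pad(now.getSeconds(), 2);
}

// May 23rd, 2018, 08:07:25 → 'abc20180523080725'
const name = uniqueAssetDescription('abc', new Date(2018, 4, 23, 8, 7, 25));
```

In a real test you would pass `new Date()` so each run gets a fresh name.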
3) The third option is to keep track of a counter. There are several ways to do this. Optimally, keep the counter in some central location so that, no matter what machine you run against, you always get the "next" value: some sort of centralized INI file, XML file, or similar. When you start the test, you read the counter value into your test and append it to the end of your asset description using a code expression similar to the one in point 2, but referencing the variable value instead of the aqConvert call. After you read that counter into your test, increment it in the external source so that the next test run gets a unique value. We do this for some of our tests, storing the counter in an SQL data table. If an external store won't work for you, you could do the same with a persistent project variable; the drawback is that, if you run the test on a second machine, you still risk duplicated numbers, because persistent variables are machine-specific.
4) This final option is the least desirable: it simply automates the manual step you're doing now. Add code to your test to detect whether the warning message is present, and if it is, have the code click the OK button to proceed. This is probably the simplest solution but, IMO, the least desirable, because you're circumventing the application's built-in checks against duplicate data and adding unnecessary steps to your test case. This can be developed as part of the recording process, or you can go back in after the fact and add the steps.
So... these are the four options I can see; there may be other ways, but in my experience they would simply be variants of these. As mentioned, I'd prefer option 1, because then you have absolute control over your test environment and always test the EXACT same thing each time, rather than running against residual artifacts left over from previous runs. Option 2 is a VERY close second. 3 and 4 are OK, but they carry a higher risk of problems.
Hope this helps give some more guidance.