Forum Discussion

info_5
10 years ago
Solved

Perform ALL visible events of some kind

Hi everybody,

I'm really new, not just to this forum but also to TestComplete, automated testing, testing in general, and working life and its routines, so I might be a bit unsure about terminology. Please pardon the somewhat cryptic subject, or forgive me if I missed the right keyword that would have led me to an answer in the forum or FAQ.

So I guess I had best describe my question by explaining what I'm trying to implement:

Our company offers the same piece of software in several languages. Let's assume that functionality and content are always the same, and all there is to do is make sure that any text visible to the user is in the appropriate language, but without recording or otherwise specifying every single occurrence of text.



I assume that “all visible text” is a rather vague catch-all term, so I'm trying to reduce it to a special subquestion: 

The menu for preferences/options/settings is full of checkboxes to be ticked, dropdown menus to be selected from, and fields to enter text into, and most of them have one of those blue circles with a yellow 'i' next to them, which displays help text when you click or mouse over it. Now my question: does TestComplete offer a way of saying something like "extract any help text that can be reached via a click/mouse-over on some specified symbol"?

If yes, I'd try to run the original and the translated version of the software, and perform a pairwise string comparison between texts extracted from corresponding places in both interfaces. If the strings are equal, this is a potentially untranslated text. That is far from a complete quality criterion, since equal texts can actually be translations ("Ok" is "Ok" in just about any language we offer), and different texts need not be translations, but it's something I could start with.
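As a minimal sketch of that pairwise comparison idea, assuming the help texts have already been extracted into two dictionaries keyed by the control they belong to (the keys and texts below are purely illustrative):

```python
def find_suspect_strings(original, translated):
    """Pairwise-compare extracted help texts from the original and the
    translated UI. Equal pairs are only *potentially* untranslated
    ("Ok" == "Ok" is fine), so they are flagged for manual review
    rather than reported as outright failures."""
    suspects = []
    for key in sorted(original.keys() & translated.keys()):
        if original[key].strip() == translated[key].strip():
            suspects.append((key, original[key]))
    return suspects

# Hypothetical extracted texts, keyed by control name.
en = {"opt.autosave": "Save automatically", "opt.confirm": "Ok"}
de = {"opt.autosave": "Save automatically", "opt.confirm": "Ok"}

print(find_suspect_strings(en, de))
# Both pairs are equal, so both are flagged for review.
```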

If no…what might come closest to what I’m trying to implement?

 

Thanks in advance,

Florian

1 Reply

  • Ryan_Moran
    Valued Contributor
    Hello Emmanouil,

    First of all, let me hopefully break any misconception you may have about automated testing in general. In my experience it is common for people to start out with the belief that automated tests should be driven to "find" bugs. While it is possible to do this, I don't think that should be your goal, but more of a possible bonus. Ideally, what we (at the company I work for) strive to do with automated testing is to maintain the quality of our applications; in other words, make sure existing functionality does not change unexpectedly. While your automated testing may catch the occasional access violation, it is never going to replace manual testing done by human beings. You may even design it to solve complex calculations and reverse-engineer functionality of key areas of your application, but you must ask yourself: "is that any different from comparing the resulting values of that calculation from version to version?"

    With that said, I believe your goal is to check as much text, in as many languages as possible, within your application and compare it version to version. What I would suggest is using the .FindAll methods to retrieve a list of objects for each form, and then building your scripts to iterate over each of these objects, store their text values, and compare them to the values from your previous application version. Most likely you'll want to perform data-driven testing and use something like an Excel spreadsheet to store the text-based values of each control for each form within separate sheets. This is of course not a simple task, but done right it can make maintenance of scripts a lot easier. Essentially, what I do is this:

    1. Write data-driven scripts to drive the application

    2. Compare various forms, grids, and control values to the previously stored excel values

    3. Generate VBS files to update the stored excel values

    4. Compile a summary of test results and email
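    A rough sketch of step 2, the version-to-version comparison. For a self-contained example this uses a CSV string in place of the Excel sheet the post describes; the control names and texts are illustrative:

```python
import csv
import io

def compare_to_baseline(current, baseline_csv):
    """Compare the captured control texts of one form against a stored
    baseline (a CSV standing in for the Excel sheet). Returns the
    differences as (control, baseline_text, current_text) tuples."""
    baseline = {row["control"]: row["text"]
                for row in csv.DictReader(io.StringIO(baseline_csv))}
    diffs = []
    for control, text in current.items():
        if baseline.get(control) != text:
            diffs.append((control, baseline.get(control), text))
    return diffs

baseline_csv = "control,text\nbtnSave,Save\nlblTitle,Preferences\n"
current = {"btnSave": "Save", "lblTitle": "Settings"}

print(compare_to_baseline(current, baseline_csv))
# → [('lblTitle', 'Preferences', 'Settings')]
```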



    When checking each form you'll likely need to check each object's ClrClassName, VCLClass, ObjectType, etc. properties returned by the .FindAll function, and use these to determine which text-based property you want to compare (.Text, .Caption, etc.).
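    That class-to-property dispatch can be sketched as a simple lookup table; the class names below are examples only, not an exhaustive or authoritative list:

```python
# Illustrative mapping from a control's class/type (as reported by
# properties like ClrClassName or ObjectType) to the property that
# carries its visible text.
TEXT_PROPERTY_BY_CLASS = {
    "TEdit": "Text",
    "TButton": "Caption",
    "TCheckBox": "Caption",
    "TLabel": "Caption",
}

def text_property_for(class_name, default="Text"):
    """Pick which property to read for a given control class,
    falling back to a default for unknown classes."""
    return TEXT_PROPERTY_BY_CLASS.get(class_name, default)

print(text_property_for("TButton"))       # → Caption
print(text_property_for("SomeNewClass"))  # → Text
```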



    Furthermore, if this method becomes too involved or difficult to maintain, you can use image compares. Image compares are a very simple way of finding differences on forms version to version, but they can also become very time-consuming to update, maintain, and review. For example, if you are comparing images of various forms and you happened to change your Windows theme between compares, it's likely that every image compare will fail or report differences. If you change OS versions you'll get similar results, where now every image needs to be updated. While TestComplete provides methods to store and even update all image stores, this became a very time-consuming and impractical task for our testing department to handle. What I ended up doing with image compares:

    1. Created routines to compare images of specific objects (such as a grid or a panel that controls sit on), thereby limiting the area each image compares and avoiding some subtle OS/theme differences.

    2. Created a routine to copy the masking from one image to another

    3. Created a routine to generate VBS update scripts that replace the old images in my image stores directory with the newly updated image, which contains the masking copied from the original image
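    The core idea of point 1, restricting the compare to a sub-region so noise outside it is ignored, can be sketched without any imaging library by treating a screenshot as a 2-D list of pixel values (purely illustrative; a real implementation would use TestComplete's region checkpoints or an image library):

```python
def region_diff(img_a, img_b, top, left, height, width):
    """Compare only a sub-region of two images (2-D lists of pixel
    values), mirroring the idea of limiting the compare to a specific
    grid or panel so theme/OS noise elsewhere is ignored. Returns the
    coordinates of differing pixels inside the region."""
    diffs = []
    for r in range(top, top + height):
        for c in range(left, left + width):
            if img_a[r][c] != img_b[r][c]:
                diffs.append((r, c))
    return diffs

# Two tiny "screenshots": they differ at (0, 0), outside the region,
# and at (2, 2), inside it. Only the in-region difference is reported.
a = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
b = [[9, 0, 0], [0, 0, 0], [0, 0, 9]]

print(region_diff(a, b, top=1, left=1, height=2, width=2))
# → [(2, 2)]
```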



    More information on image compares here and here.