Forum Discussion

willange
Occasional Visitor
4 years ago

How to get more verbose messaging in automation test executions

I'm using Zephyr Scale Cloud to upload the results of my pytest test suite to Zephyr (using the junitxml format). So far this seems to be working fairly well, and I'm able to get helpful error info on the failing tests. I'd like to capture the output of my passed tests and have it added to those test executions as well.

Right now, the passing tests don't get any additional info beyond the fact that they passed, while the failed tests get good info about their failures.

I have the info I want in my XML file. I ran pytest with the -rpx flag and the junit_logging=all option, so there is a good deal of information in that output file about the passed tests, but the endpoint "https://api.zephyrscale.smartbear.com/v2/automations/executions/junit" seems to ignore pretty much everything that isn't a failure message.


Is there a way to call this endpoint so that I get more info on my passed tests as well, or perhaps a different way to write the XML output file that accomplishes the same thing?
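For reference, a minimal sketch of what I mean: with junit_logging=all, pytest puts captured stdout into a <system-out> element on each testcase, so the data is recoverable from the file even if the upload endpoint ignores it. The sample XML and the helper name below are hypothetical, just to illustrate the structure:

```python
# Sketch: pull captured stdout for *passed* tests back out of a
# junitxml report. SAMPLE mimics what pytest emits with
# `-o junit_logging=all`; file contents and names are made up.
import xml.etree.ElementTree as ET

SAMPLE = """\
<testsuites>
  <testsuite name="pytest" tests="2" failures="1">
    <testcase classname="tests.test_math" name="test_add">
      <system-out>checked 2 + 2 == 4</system-out>
    </testcase>
    <testcase classname="tests.test_math" name="test_div">
      <failure message="ZeroDivisionError">division by zero</failure>
    </testcase>
  </testsuite>
</testsuites>
"""

def passed_test_output(xml_text):
    """Map 'classname::name' -> captured stdout for tests with no failure/error."""
    root = ET.fromstring(xml_text)
    out = {}
    for case in root.iter("testcase"):
        # Skip anything that failed or errored; those already get good info.
        if case.find("failure") is not None or case.find("error") is not None:
            continue
        log = case.find("system-out")
        if log is not None and log.text:
            out[f"{case.get('classname')}::{case.get('name')}"] = log.text
    return out

print(passed_test_output(SAMPLE))  # only test_add's output survives
```

So a fallback could be to run this kind of extraction after the upload and push the text somewhere else, but ideally the endpoint would just take it.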

  • Hi willange 

     

    Great post!

     

    My initial thought on the XML file was: is there an attachment tag enabled? And could you parse the log file as part of your automation flow?

    But you seem to have answered that, so this is more about the reporting of your passed tests, right?

     

    We'd need to know exactly how that API is defined and how it is intended to be used; I'm not sure whether ignoring output for passed tests is its expected behavior.

     

    I would check with support just to double-check:

     

    https://smartbear.atlassian.net/servicedesk/customer/portal/42

     

    I know this has worked in other SmartBear test management tools.

     

    However, I am also conflicted about how useful this is: attaching log data to the execution instances of your test cases for passed tests means duplicating a lot of data from your automation framework into your test management solution.

    That could also lead to performance issues down the line, especially if you are continuously running a suite, say a daily/weekly/biweekly regression suite.

     

    I am interested, so if you already have a solution, please do share!

     

    If you hear back from support, please also share the outcome regarding the API behavior with us.

     

    Kind regards,

     

    Vinnie