Forum Discussion

philAtkin
New Contributor
15 years ago

Coverage methodology

I've used AQTime for quite a few years, generally for performance tuning, and found it excellent.  More recently I've been using it for code coverage analysis, and while I can say that it works, it does seem rather clumsy and I wonder if my methodology is at fault.

For the record, I'm using AQTime 6.50.498.86 running under 32-bit Windows XP; the tests are written with NUnit 2.5.1 and the library under test is written in C# (Visual C# 2005) targeting .NET 2.0.

My basic problem is that the speed of profiling, even with the "Light" coverage profiler, is very slow.  Running just over 200 tests with line-level profiling on a debug build of the library takes around 6 hours; with no profiling, the same run takes about 6 minutes.  (I find that the profiler results don't tie up accurately with the library sources if the code is optimised, hence the use of a slower debug build.)  My code may be unusual in that I'm processing images, so there are lots of tight inner loops executed many times.
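To give an idea of the shape of code I mean, here's a made-up routine (the names and the code are invented for illustration, not taken from my actual library) of the kind that dominates my run times:

    // Hypothetical per-pixel routine (not my real code).  With line-level
    // profiling, every line of the inner loop body is instrumented, so the
    // profiling overhead scales with width * height on every call.
    public static class ImageOps
    {
        public static byte[,] Threshold(byte[,] pixels, byte cutoff)
        {
            int width = pixels.GetLength(0);
            int height = pixels.GetLength(1);
            byte[,] result = new byte[width, height];

            for (int x = 0; x < width; x++)
            {
                for (int y = 0; y < height; y++)
                {
                    // This single line executes width * height times per call.
                    result[x, y] = pixels[x, y] >= cutoff ? (byte)255 : (byte)0;
                }
            }
            return result;
        }
    }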

My methodology is something like the following: I want to achieve 100% line coverage on most routines.  When I don't achieve this, I want to assess each routine to see if the coverage is adequate.  If it is, I needn't consider it further; if not, I need to introduce more tests.  This is inherently an iterative process as the library and the test suite (and the setup of the profiler) change - so execution time is significant.  So what I do is to create various 'Exclude routine' areas in the following categories:

  • Coverage complete (100% achieved)

  • Coverage adequate (<100%, but on inspection the coverage is adequate)

  • Coverage not required (areas of the library I don't need/want to test, such as debugging helpers)

  • Coverage undetected but adequate (certain types of functions, such as constructors for generic classes, don't appear to be profiled correctly, so I class the coverage as 'undetected'; see the sketch after this list)

  • Line info not found (the profiler warns before the start of the run that it can't profile certain routines; I add these to this exclude category to avoid those warnings)
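
As an illustration of the 'undetected' case, here's a made-up class (invented for this post, not my actual code): its constructor clearly executes during the tests, yet the profiler reports no line coverage for it:

    // Hypothetical generic class (not my real code).  The constructor runs
    // whenever a test creates a Buffer<T>, but no line-level coverage is
    // reported for it, so I class its coverage as 'undetected'.
    public class Buffer<T>
    {
        private readonly T[] _items;

        public Buffer(int capacity)
        {
            _items = new T[capacity];   // executed, but not reported as covered
        }

        public int Capacity
        {
            get { return _items.Length; }
        }
    }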

Now, on each run I examine the results.  Whenever I'm satisfied with the coverage I move a routine into either 'complete' or 'adequate'.  I also move routines into the other three areas as I think fit.  Gradually, the amount of code actually being profiled decreases to the point where the running times are manageable.  Finally, all the routines end up in one or other of the exclude areas and the profiler refuses to run!
This is a long, laborious, error-prone process, and it takes many days.  At the end of it I'm left with a sort of 'non-report' from the profiler, because every conclusion I reach about my code results in the profiler being turned off for that routine!  Further, it's quite difficult to check that my allocations are correct.  For example, I can check that each of my 'coverage complete' routines is still covered 100% by disabling that exclude area and verifying that every routine still produces the 100% result, but it's much harder to check that no routine belonging in that area has ended up in another one.  Moving routines between the areas is also fiddly.

I'd like to stress that I do all this only because the long running times force me to.  I don't really understand why the profiler doesn't do for itself what I'm doing manually: once a routine has achieved 100% coverage, surely the profiler has no further need to analyse it?  That conclusion could be reached automatically during a run; I can only reach it between runs.
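
The check I have in mind seems simple enough.  Here's a hypothetical sketch of the idea (this is not AQtime code, just what I imagine it could do internally):

    // Hypothetical: after a routine returns, the profiler could test whether
    // every line now has a non-zero hit count, and if so drop that routine's
    // instrumentation for the remainder of the run.
    static class CoverageCheck
    {
        public static bool FullyCovered(long[] lineHitCounts)
        {
            foreach (long hits in lineHitCounts)
            {
                if (hits == 0)
                {
                    return false;   // at least one line never executed
                }
            }
            return true;            // 100% line coverage reached
        }
    }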

So my basic question is: is there a better way to do all this?

  • Hello Phil,

    Currently, the Light Coverage profiler does not give a noticeable benefit when profiling a .NET application - it works well only for native applications. We will try to improve the situation in the future.

    When profiling by lines, AQtime instruments every line of code, adding special service code that collects the profiling information. So, if you have fast code lines that are executed many times, you get a noticeable overhead from the execution of AQtime's service code. This is what happens in your case.
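
    Schematically, the effect is as if every source line gained an extra bookkeeping call. The following is only an illustration of the idea - it is not the actual service code that AQtime injects:

        // Illustration only - NOT AQtime's real instrumentation.  A loop body
        // of N instrumented lines executed M times pays roughly N * M extra
        // service calls, which dominates the run time in tight loops.
        static class Profiler
        {
            public static void RecordLineHit(int line)
            {
                // the real service code would record a hit for this line here
            }
        }

        static class Example
        {
            public static int SumRow(byte[] row)
            {
                int sum = 0;
                for (int i = 0; i < row.Length; i++)
                {
                    Profiler.RecordLineHit(101);   // injected for the line below
                    sum += row[i];
                }
                Profiler.RecordLineHit(102);       // injected for the line below
                return sum;
            }
        }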

    At the moment, we can't suggest any solution to improve the profiling speed. Sorry.
  • philAtkin
    New Contributor
    Thanks Alex - at least I can stop wondering if there's a better way.

    Phil