dsaracini
14 years ago · New Contributor
Metrics
Hello,
I've only been using Code Collaborator for about a month, and I really like it. It has given me visibility into the code review process that I've never had before. I can now examine things like review rates (LOC reviewed per hour). But now that I have some of this information, I'm not really sure how my team is doing, or how they compare to others (i.e., best practices).
In other words, are my guys still rushing through reviews? (Note: I suspect they are.) Or are they wasting precious time?
Is there any place where SmartBear publishes suggested metrics to compare against? If not, I would love it if people would share some of the things they track for their teams. Has anyone ever taken the time to create metrics to compare one release against another? In other words, total LOC changed / total review time, and then tried to correlate that against defects reported in the release?
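To make the comparison concrete, here is a minimal sketch of the kind of per-release summary I have in mind. The release names and numbers are made up for illustration; in practice the totals would come from Code Collaborator's reports.

```python
# Hypothetical per-release totals (illustrative values only), as might be
# pulled from Code Collaborator's review reports.
releases = {
    "1.0": {"loc_changed": 12000, "review_hours": 40.0, "defects": 30},
    "1.1": {"loc_changed": 8000,  "review_hours": 32.0, "defects": 12},
}

summary = {}
for name, r in releases.items():
    rate = r["loc_changed"] / r["review_hours"]            # LOC reviewed per hour
    density = r["defects"] / (r["loc_changed"] / 1000.0)   # defects per KLOC
    summary[name] = (rate, density)
    print(f"{name}: {rate:.0f} LOC/hr, {density:.1f} defects/KLOC")
```

With a few releases' worth of these numbers, one could then look for a relationship between review rate and defect density (e.g., whether faster-reviewed releases ship more defects).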
TIA,
David