On 22 Sep 2008, at 16:37, Holger Guhl wrote:
> The project is for the company CAST, which makes software for
> software quality assessment.
I know them, that's cool!
> I have a simple question: Do you have more analysis methods and
> metrics that are not yet published with Moose?
You can find a pre-release of (some of) my current work here:
http://www.iam.unibe.ch/~akuhn/d/Kuhn-2008-WCRE-SoftwareMap.pdf
> And how about the correctness of the measured values? The class
> comment states: "Right now only Number of MessageSends is computed
> in a correct manner." At first glance, I cannot see which of the
> measured values could be wrong. Any hint on inappropriate
> computation or possible improvements would be appreciated. The only
> thing I find a bit confusing is the computation of the two McCabe
> numbers, #cyclomaticNumber and #cyclomaticNumber2. Unfortunately,
> the class comment does not tell enough.
Correctness of measured values is a big issue. The correctness of the
numbers that Moose produces is unknown. Even such simple measurements
as the number of classes or method invocations might be wrong! Why?
Moose uses the FAMIX model, which aims to be language independent,
and that can only be achieved at the cost of some precision. FAMIX is
thus a lossy rather than precise representation of software, think
JPEG vs PNG. For that reason I suggested some time ago to add an
error range to all numbers, and I started to take a look at some of
them. That is why there are two different McCabe measurements. As far
as I recall, #cyclomaticNumber2 is more correct than
#cyclomaticNumber. I only looked at McCabe and found that it was not
correct, so I guess the other measurements need careful review too.
But alas, I did not have time to complete that work...
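
For reference (a general note, not a description of the actual Moose
code): McCabe's cyclomatic number is usually computed as the number of
decision points plus one, equivalently E - N + 2P on the control-flow
graph, and variants differ mostly in what they count as a decision
point, e.g. whether short-circuit boolean operators contribute. Here
is a toy sketch in Python, with made-up selector names, just to show
how two counting conventions diverge on the same method:

  # Illustrative only -- not the Moose implementation.
  DECISION_SELECTORS = {"ifTrue:", "ifFalse:", "whileTrue:", "to:do:"}
  BOOLEAN_SELECTORS = {"and:", "or:"}

  def mccabe(sends, count_booleans=False):
      """McCabe as 'decision points + 1'; optionally also count
      short-circuit boolean operators as decision points."""
      decisions = sum(1 for s in sends if s in DECISION_SELECTORS)
      if count_booleans:
          decisions += sum(1 for s in sends if s in BOOLEAN_SELECTORS)
      return decisions + 1

  sends = ["ifTrue:", "and:", "whileTrue:", "printString"]
  print(mccabe(sends))         # 3 with the plain convention
  print(mccabe(sends, True))   # 4 with the extended convention

Two tools applying two such conventions will report different numbers
for the same code, which is exactly the kind of discrepancy an error
range on the measurement would make explicit.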
cheers,
AA