Code complexity analysis tools beyond cyclomatic complexity

Posted 2019-02-09 04:30

Question:

While cyclomatic complexity is a worthwhile metric, I tend to find it a poor tool for identifying difficult-to-maintain code. In particular, it tends to highlight only certain kinds of code (e.g. parsers) while missing difficult recursion, threading, and coupling problems, as well as many of the anti-patterns that have been defined.
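To make the point concrete, here is a small hypothetical example (the class is invented for illustration): every method has a cyclomatic complexity of 1, yet the class contains a classic data race that no complexity threshold would flag.

```java
// Hypothetical class, invented for illustration: trivially low cyclomatic
// complexity, yet broken under concurrent use.
class VisitCounter {
    private int count = 0;   // shared mutable state, no synchronization

    void record() {
        count++;             // read-modify-write, not atomic
    }

    int total() {
        return count;
    }
}
```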

What other tools are available to identify problematic Java code?

Note that we already use PMD and FindBugs, which I believe are great for method-level problem identification.

Answer 1:

My experience is that the most important metrics when looking at code maintainability are:

  • Cyclomatic Complexity, to identify large chunks of code that are probably hard to understand/modify.
  • Nesting depth, to find similar spots (high nesting depth automatically means high cyclomatic complexity, but not necessarily the other way around, so it is important to look at both scores; see the sketch after this list).
  • Fan in/out, to get a better view of the relationships between methods/classes and the actual importance of individual methods.
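To illustrate the difference between the first two metrics, here is a hypothetical Java sketch (the methods and the Order type are invented for illustration). The flat switch scores a slightly higher cyclomatic complexity than the nested ifs, yet the nested version is the one that is harder to follow and to modify safely.

```java
// Invented examples, purely to illustrate why cyclomatic complexity (CC)
// alone can mislead and why nesting depth is worth tracking as well.
public class ComplexityExamples {

    // CC around 5 (one path per case), nesting depth 1: despite the higher
    // CC score, this is trivial to read and modify.
    static String describeStatus(int code) {
        switch (code) {
            case 200: return "OK";
            case 301: return "Moved";
            case 404: return "Not Found";
            case 500: return "Server Error";
            default:  return "Unknown";
        }
    }

    // CC around 4, nesting depth 3: a similar CC score, yet the control
    // flow is much harder to follow and to change safely.
    static String describeOrder(Order order) {
        if (order != null) {
            if (order.isPaid()) {
                if (order.isShipped()) {
                    return "Completed";
                }
                return "Awaiting shipment";
            }
            return "Awaiting payment";
        }
        return "No order";
    }

    interface Order {
        boolean isPaid();
        boolean isShipped();
    }
}
```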

When examining code that was written by others, it is often useful to include dynamic techniques. Simply run common usage scenarios through a profiler/code coverage tool to discover:

  • Code that is actually executed a lot (the profiler is great for this; just ignore the timing info and look at the hit counts instead). A minimal sketch of the hit-count idea follows this list.
  • Code coverage, which is great for finding (almost) dead code and stops you from investing time in refactoring code that is rarely executed anyway.
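If you do not have a profiler at hand, the idea behind hit counts is simple enough to sketch by hand. The class below is a hypothetical, minimal stand-in, not a replacement for a real profiler or coverage tool:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

// Hand-rolled hit counter: a real profiler or coverage tool does this far
// better, but the underlying data is simply "how often was this spot reached".
final class HitCounter {
    private static final Map<String, LongAdder> HITS = new ConcurrentHashMap<>();

    static void hit(String label) {
        HITS.computeIfAbsent(label, k -> new LongAdder()).increment();
    }

    static void dump() {
        HITS.forEach((label, count) ->
                System.out.println(label + " -> " + count.sum() + " hits"));
    }
}
```

Sprinkle HitCounter.hit("SomeParser.parseExpression") into the paths you suspect (the label is hypothetical), run a typical usage scenario, then call HitCounter.dump() to see which spots are hot and which are never reached.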

The usual suspects (any profiler, code coverage tool, or metrics tool) will usually give you the data required to make these assessments.



Answer 2:

Google Testability Explorer checks, for example, for singletons and other static constructs that are bad smells in a design. Metrics is an Eclipse plugin that measures almost every code metric known to mankind. I have used both and can easily recommend them.
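For context, the kind of design smell Testability Explorer targets looks roughly like the following; the classes are invented purely for illustration. A class that reaches out to a global singleton is hard to test in isolation, whereas injecting the collaborator through the constructor keeps it substitutable in tests:

```java
// Invented classes, purely to illustrate the kind of smell such tools flag.

// Hard to test: the collaborator comes from a global singleton, so a unit
// test cannot substitute a fake gateway.
class CheckoutServiceHardToTest {
    double charge(double amount) {
        return PaymentGateway.getInstance().process(amount);
    }
}

// Easier to test: the collaborator is injected, so a test can pass a stub.
class CheckoutService {
    private final PaymentGateway gateway;

    CheckoutService(PaymentGateway gateway) {
        this.gateway = gateway;
    }

    double charge(double amount) {
        return gateway.process(amount);
    }
}

class PaymentGateway {
    private static final PaymentGateway INSTANCE = new PaymentGateway();

    static PaymentGateway getInstance() {
        return INSTANCE;
    }

    double process(double amount) {
        return amount; // placeholder for a real external call
    }
}
```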



Answer 3:

Sonar tries to identify "hot spots" of complexity and maintainability by combining the results of various open source tools (including PMD and FindBugs). It integrates well with Maven and CI servers (especially Hudson).

EDIT by extraneon

There is a Sonar site available where a lot of open source projects are analyzed. I think it shows quite well how many rules get applied and how far the drill-down goes. You can of course also disable rules you don't find interesting.

Here is an explanation of the metrics.



Answer 4:

I have never used it, but I found this rather interesting and promising:

http://erik.doernenburg.com/2008/11/how-toxic-is-your-code/

And I have used this one and found it extremely helpful because of its nice visualization of dependencies:

http://www.headwaysoftware.com/products/structure101/index.php



Answer 5:

The static analysis tools you already use are pretty standard. If you're using Eclipse, try looking here for more code analysis tools.

Emma provides analysis of code coverage, though this is really for testing.



Answer 6:

The NDepend tool for .NET code lets you analyze many dimensions of code complexity, including code metrics such as cyclomatic complexity, nesting depth, lack of cohesion of methods, and coverage by tests...

...as well as dependency analysis and Code Rules over LINQ Queries (CQLinq), which are dedicated to asking what is complex in your code and to writing rules about it. Around 200 default code rules are provided. They cover anti-patterns such as the Singleton, detection of threading problems, and detection of coupling problems such as a UI layer using DB types directly...

A while back, I wrote an article to summarize several dimensions of code complexity: Fighting Fabricated Complexity