Are there any static analysis tools that will report violations of the SOLID principles?

Posted 2020-06-17 04:47

Question:

I know blindly following any "best practice" can still lead to a stinking pile of crap that strictly adheres to the best practice. The SOLID principles are just that, principles. They don't apply to every situation but they are still very good heuristics for finding possible improvements in your code.

The downside to them is that they sometimes require a deep analysis of your source code to apply them. I, like most programmers, am constantly on the lookout for more efficient ways of doing things. So, I am curious if anyone has heard of an analysis tool that attempts to test for the application of SOLID principles (or lack thereof).

SRP The Single Responsibility Principle

A class should have only one reason to change.

OCP The Open-Closed Principle

Software entities (classes, modules, functions, etc.) should be open for extension, but closed for modification.

LSP The Liskov Substitution Principle

Subtypes must be substitutable for their base types.

ISP The Interface Segregation Principle

Clients should not be forced to depend upon methods that they do not use. Interfaces belong to clients, not to hierarchies.

DIP The Dependency Inversion Principle

Abstractions should not depend upon details. Details should depend upon abstractions.

-From Agile Principles, Patterns and Practices by Robert C. Martin.

Answer 1:

I don't think automatic static analysis can determine whether the principles are respected. To write such a tool you'd need to formally define what each concept means and have a way to check that definition against arbitrary code. How would you formalize the notion of a responsibility? Personally, I have no idea.

That said, tools can help you detect the likelihood of a violation. For example, you could use code metrics such as the number of methods or members per class to flag classes that are too big and therefore likely to violate the SRP.
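To give a rough idea of what such a heuristic could look like, here is a minimal Java sketch; the method-count threshold and the classes being inspected are arbitrary choices for the example, not anything a real tool prescribes:

import java.lang.reflect.Method;
import java.util.List;

public class SrpHeuristic {
    private static final int METHOD_THRESHOLD = 20; // arbitrary cut-off, an assumption

    public static void main(String[] args) {
        // Classes to inspect -- picked arbitrarily for the example.
        List<Class<?>> classesToCheck = List.of(String.class, java.util.ArrayList.class);
        for (Class<?> c : classesToCheck) {
            Method[] methods = c.getDeclaredMethods();
            if (methods.length > METHOD_THRESHOLD) {
                System.out.printf("%s declares %d methods -- possible SRP suspect%n",
                        c.getName(), methods.length);
            }
        }
    }
}

Of course a high method count only hints at too many responsibilities; a human still has to judge whether the class genuinely has more than one reason to change.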

An exception might be the Liskov Substitution Principle. If you define contracts on all methods (preconditions, postconditions, invariants), then you can check that a method overriding a superclass method does not strengthen the precondition, does not weaken the postcondition, and respects the invariants of the superclass's method. I think the tool ESC/Java performs those checks. Judging from the Wikipedia page on the LSP, more checks than these would be needed.
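For illustration, here is a small sketch using JML-style annotations (the comment notation that ESC/Java checks); the classes and contracts are made up for the example, not taken from any real checker:

class Account {
    protected int balance;

    //@ requires amount > 0;
    //@ ensures balance == \old(balance) + amount;
    void deposit(int amount) {
        balance += amount;
    }
}

class AuditedAccount extends Account {
    // LSP-compatible override: the precondition is not strengthened and the
    // original postcondition still holds.
    //@ also
    //@ ensures balance == \old(balance) + amount;
    @Override
    void deposit(int amount) {
        super.deposit(amount);
        // an audit log entry would go here
    }
}

class MinimumDepositAccount extends Account {
    // A contract checker could flag this override: rejecting amounts the
    // base class accepts effectively strengthens the precondition.
    @Override
    void deposit(int amount) {
        if (amount < 100) {
            throw new IllegalArgumentException("minimum deposit is 100");
        }
        super.deposit(amount);
    }
}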



Answer 2:

My answer involves a .NET-specific product, apologies in advance, and maybe someone can suggest its non-.NET analogs.

I'd give NDepend a try and see whether it can point me to SRP and ISP violations using metrics like:

  • number of methods per type
  • types with abnormally high numbers of methods
  • afferent/efferent coupling at the assembly and type level
  • other metrics (the full list of metrics is here)

DIP and LSP violations may be harder to track down because they involve the programmer's intent. An analysis tool can identify the relationship between types, but how can it tell a case where one class genuinely extends another from one where Square inappropriately derives from Rectangle? Or know that, in a properly designed program, A should have depended on B and not the other way around?
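To illustrate with the classic example (the class names here are made up for the sketch, not produced by any tool):

class Rectangle {
    protected int width, height;

    void setWidth(int w)  { width = w; }
    void setHeight(int h) { height = h; }
    int area()            { return width * height; }
}

class Square extends Rectangle {
    // Maintaining the "all sides equal" invariant silently changes both
    // dimensions, breaking code written against Rectangle's expectations.
    @Override void setWidth(int w)  { width = w; height = w; }
    @Override void setHeight(int h) { width = h; height = h; }
}

class LspDemo {
    public static void main(String[] args) {
        Rectangle r = new Square();
        r.setWidth(5);
        r.setHeight(4);
        // A caller reasoning about Rectangle expects 20; this Square yields 16.
        // Structurally this is ordinary inheritance, so a purely structural
        // analysis sees nothing wrong with it.
        System.out.println(r.area());
    }
}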

OCP presents a different challenge, because the extension (or modification) that the class should be open (or closed) to may not have taken place yet.
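A tiny, contrived sketch of why that matters: the design below looks fine today, and the violation only becomes visible once a new shape arrives and forces an edit to existing code.

abstract class Shape { }

class Circle extends Shape {
    final double radius;
    Circle(double radius) { this.radius = radius; }
}

class Rect extends Shape {
    final double w, h;
    Rect(double w, double h) { this.w = w; this.h = h; }
}

class AreaCalculator {
    // Every new Shape subtype forces a modification of this method. An
    // OCP-friendly alternative would declare an abstract area() on Shape,
    // so new shapes extend the design instead of editing it.
    static double computeArea(Shape s) {
        if (s instanceof Circle) {
            Circle c = (Circle) s;
            return Math.PI * c.radius * c.radius;
        } else if (s instanceof Rect) {
            Rect r = (Rect) s;
            return r.w * r.h;
        }
        throw new IllegalArgumentException("unknown shape: " + s);
    }
}

Until someone actually tries to add a triangle, a static tool has little to report here.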

However, if we believe that following SOLID leads to a more maintainable product (proving this claim scientifically is not what this question is about), then NDepend's Abstractness-Instability chart should give a good aggregate measure of how well the principles were followed for each software module. If they were, the module should have avoided the bottom-left corner of the chart, dubbed "The Zone of Pain". In that zone, the module is stable (not in a good way -- too many others depend on it, so it's difficult to change), but not abstract enough.
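For reference, that chart is built from Robert C. Martin's package metrics. Here is a back-of-the-envelope sketch, with invented sample numbers, of how a module ends up in that corner:

public class MainSequence {
    // Instability I = Ce / (Ca + Ce): 0 = maximally stable, 1 = maximally unstable.
    static double instability(int afferentCa, int efferentCe) {
        return (double) efferentCe / (afferentCa + efferentCe);
    }

    // Abstractness A = abstract types / total types.
    static double abstractness(int abstractTypes, int totalTypes) {
        return (double) abstractTypes / totalTypes;
    }

    // Distance from the "main sequence": D = |A + I - 1|.
    static double distance(double a, double i) {
        return Math.abs(a + i - 1);
    }

    public static void main(String[] args) {
        // Hypothetical module: many incoming dependencies, almost no abstractions.
        double i = instability(40, 2);   // ~0.05 -> very stable, hard to change
        double a = abstractness(1, 50);  // 0.02  -> barely abstract
        System.out.printf("I=%.2f  A=%.2f  D=%.2f  (bottom-left: Zone of Pain)%n",
                i, a, distance(a, i));
    }
}

A module with both values near zero sits deep in the bottom-left corner: heavily depended upon, concrete, and therefore painful to change.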