Misused design patterns

Posted 2020-05-19 05:59

Are there, in the canonical Gang of Four list, any design patterns that you often find misused, misunderstood or overused (other than the highly debated Singleton)? In other words, is there a design pattern you would advise to think twice before using? (And why?)

15 Answers
看我几分像从前
#2 · 2020-05-19 06:27

The observer pattern is pretty much redundant in C#, because the language has events built in.
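To make the comparison concrete, here is a minimal sketch of the hand-rolled observer wiring that a C# `event` declaration generates for you; `TemperatureSensor` and the listener names are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal observer sketch: register callbacks, then notify them all.
// In C#, an `event` field gives you this add/notify plumbing for free.
class TemperatureSensor {
    private final List<Consumer<Double>> observers = new ArrayList<>();

    void addObserver(Consumer<Double> observer) {
        observers.add(observer);
    }

    void publish(double reading) {
        for (Consumer<Double> o : observers) {
            o.accept(reading);  // notify every registered observer
        }
    }
}
```

Usage is just `sensor.addObserver(r -> updateDisplay(r));` followed by `sensor.publish(21.5);` somewhere in the sensor's own code.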

劫难
#3 · 2020-05-19 06:27

REPOSITORY PATTERN

Most people start using this pattern right after reading the Domain Driven Design book by Eric Evans.

How many folks here have seen repositories constructed like data access objects?
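The distinction the answer is pointing at can be sketched like this; the `Customer` aggregate and all interface names are hypothetical, chosen only to show the contrast between table-oriented CRUD and a domain-oriented collection:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical aggregate used for illustration.
record Customer(String id, String name) {}

// DAO style: a thin, table-oriented CRUD wrapper that mirrors the database.
interface CustomerDao {
    void insert(Customer c);
    Customer selectById(String id);
}

// Repository style (DDD): the illusion of an in-memory collection of
// aggregates, expressed in domain terms with no persistence vocabulary.
interface CustomerRepository {
    void add(Customer c);
    List<Customer> matching(Predicate<Customer> spec);
}

// Trivial in-memory repository, just enough to show the collection feel.
class InMemoryCustomerRepository implements CustomerRepository {
    private final Map<String, Customer> store = new HashMap<>();

    public void add(Customer c) {
        store.put(c.id(), c);
    }

    public List<Customer> matching(Predicate<Customer> spec) {
        List<Customer> out = new ArrayList<>();
        for (Customer c : store.values()) {
            if (spec.test(c)) out.add(c);  // specification decides membership
        }
        return out;
    }
}
```

A repository that grows `insert`/`update`/`selectById` methods has quietly become the DAO on the left, which is the misuse being described.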

劳资没心,怎么记你
#4 · 2020-05-19 06:28

Only use a pattern when it's called for. You can't predict the future, so while you might put a pattern in to make the design flexible, what happens when the product takes a different direction and your pattern becomes the very thing that's keeping you from implementing what your users want?

Keep designs as simple as possible to start with. When you know more about how your design needs to change, use the appropriate pattern and not before.

该账号已被封号
#5 · 2020-05-19 06:30

Factory Patterns...

I was once parachuted into a project where every single MyObject in the system had an equivalent MyObjectFactory for generating new instances. There was no concept of abstraction or extended classes... just plain old ClassX & ClassXFactory.

And no-one could explain why... "It was just the way things had always been done"
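The shape of the anti-pattern looks something like this (class names invented for illustration):

```java
// A plain class with no hierarchy and no variation in how it's built.
class Widget {
    final String label;

    Widget(String label) {
        this.label = label;
    }
}

// The pointless factory: no abstraction, no choice among implementations,
// no configuration. It merely restates the constructor, so calling
// `new Widget(...)` directly would do exactly the same job.
class WidgetFactory {
    Widget create(String label) {
        return new Widget(label);
    }
}
```

A factory earns its keep when it hides *which* concrete class gets built or *how*; when there is only ever one class and one constructor, the factory is pure ceremony.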

Juvenile、少年°
#6 · 2020-05-19 06:33

The only one (besides the aforementioned Singleton and its partner in crime, the Factory) wouldn't be a GoF pattern: it would be setters and getters applied to an object's native properties.

Setters and getters applied to member variables are functionally identical to public member variables. A getter without a setter is more like a public final member variable -- but at that point, why not just use a public final member variable? It does no more harm.

The only difference is that you "could" intercept the call and override it, but people rarely do. More often it's used as a crutch by procedural programmers to avoid OO programming (which is the real reason it's an anti-pattern).

With a setter and/or getter you are still exposing your inner member structure to the outside world (for instance, you'll have to refactor other classes if you find you need to change an int to a long), and you are almost assuring that some code that should be inside your object is instead being placed outside it.

There are a few exceptions I can think of:

Setters used to simplify an object's construction. Sometimes it's necessary to create an object and then set other values afterwards. These values should be immutable (you shouldn't be able to call set twice) for safety.

Getters used to access contained objects. Since contained objects are usually able to ensure their own integrity, sharing them is fine. Setters are generally bad in this case: you don't want an object with a specific state swapped out right under your nose, since that makes assuring your own integrity much more difficult.

Java Beans used for screen components: yeah, I can't figure out a better way to implement these "property balls". Reflection comes in handy for these components, and the pattern is useful -- it's kinda hacky, but it works.

DAO/DTO bean objects. Honestly I think these are an iffy usage of the pattern, but they are the pattern. Because access has to go through reflection, manipulating the properties via metadata instead of code is much harder than it should be. The bean's properties are always tied to some outside source (database format, data transfer format, component properties, ...), so why are we duplicating the work of defining each part?

Edit: Stolen from kyoryu's comment and brought up into the post because it's really a perfect summary of what I was saying and could be missed in the comments. Needed since not everybody seems to get the concept that adding accessors to the language only codifies a bad OO design pattern:

Short version -

    // BAD CODE:
    if (account1.balance > 1000)
    {
        account1.balance = account1.balance - 1000;
        account2.balance = account2.balance + 1000;
    }

    // GOOD CODE:
    account2.deposit(account1.withdraw(1000));

The second one doesn't require accessors... – kyoryu (slightly modified by Bill K because I have a little more room than he did in his comment).

The second one moves the test and some other math inside Account rather than duplicating it throughout the code every place you might make a transfer.

Just to belabor the point EVEN MORE, note that with the "GOOD CODE" style it's pretty obvious that the output of .withdraw could be a Transaction object that contains information about the entire transaction including its success, source and destination and logging ability. How would this have occurred to someone who writes their code in "BAD CODE" style?

Also how would you refactor BAD CODE to even use such an object? It's just a mess.
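For what it's worth, here is one way the "GOOD CODE" style could be sketched out; `Transaction`, `reportedBalance`, and the rest of the names are my invention for illustration, not from any particular codebase:

```java
// A withdrawal produces a Transaction describing what happened, which is
// exactly the object the answer says falls naturally out of this style.
class Transaction {
    final long amount;
    final boolean succeeded;

    Transaction(long amount, boolean succeeded) {
        this.amount = amount;
        this.succeeded = succeeded;
    }
}

class Account {
    private long balance;  // never exposed via a raw getter/setter pair

    Account(long openingBalance) {
        this.balance = openingBalance;
    }

    Transaction withdraw(long amount) {
        if (balance < amount) {
            return new Transaction(0, false);  // invariant enforced here, once
        }
        balance -= amount;
        return new Transaction(amount, true);
    }

    void deposit(Transaction t) {
        if (t.succeeded) {
            balance += t.amount;  // only successful withdrawals move money
        }
    }

    long reportedBalance() {
        return balance;  // read-only view, e.g. for statements
    }
}
```

The transfer then really is one line, `account2.deposit(account1.withdraw(1000));`, and the overdraft check lives in exactly one place instead of at every call site.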

放我归山
#7 · 2020-05-19 06:34

Actually, I would say design patterns in general are overused when a KISS (Keep It Simple, Stupid) solution is all that's needed.

Design patterns are great for making a system flexible, but at the cost of making the implementation more complex. This is only a worthwhile trade off when the flexibility will provide a practical advantage.
