I am developing an app that creates a large number of small, immutable Java objects. An example might be:
public class Point {
    final int x;
    final int y;
    final int z;
    .....
}
Where it is likely that many instances of Point will need to refer to the same (x,y,z) location.
To what extent does it make sense to try to cache and re-use such objects during the lifetime of the application? Any special tricks to handle this kind of situation?
Do it when it becomes a problem; otherwise you're just creating a useless layer of abstraction.
Either way, you could easily implement this with a PointFactory that you call to get a Point, which always returns the same object instance for any given x, y and z. But then you have to manage when the points should be removed from the cache, because they won't be garbage collected. I say forget about it unless it's an actual issue. Your application shouldn't depend on such a caching mechanism anyway, which leaves you free to add it in later if necessary. So maybe just use a factory that returns a new Point instance every time for now.
How many instances will share the same coordinates, how many will exist at the same time, and how many will be discarded?
Reusing the objects only has benefits if a significant percentage of live objects at one time are duplicates (at least 20%, I'd say) and overall memory usage is problematic. And if objects are discarded frequently, you have to construct the cache in a way that prevents it from becoming a memory leak (probably using soft/weak references).
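A rough sketch of the weak-reference variant, assuming Point has an (x, y, z) constructor and value-based equals/hashCode (the class name is illustrative):

import java.lang.ref.WeakReference;
import java.util.Map;
import java.util.WeakHashMap;

public final class PointInterner {
    // WeakHashMap keys plus WeakReference values: once nothing else holds a
    // given Point, both the key and the cached instance can be collected,
    // so the cache cannot turn into a memory leak.
    private final Map<Point, WeakReference<Point>> cache = new WeakHashMap<>();

    public synchronized Point intern(int x, int y, int z) {
        Point candidate = new Point(x, y, z);
        WeakReference<Point> ref = cache.get(candidate);
        Point cached = (ref != null) ? ref.get() : null;
        if (cached != null) {
            return cached;          // duplicate: reuse the existing instance
        }
        cache.put(candidate, new WeakReference<>(candidate));
        return candidate;           // first occurrence: cache and return it
    }
}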
As in most cases: it depends.
If your object is rather complex (takes a lot of time to instantiate) but can be expressed as a string, it makes sense to create and load instances through a static factory method.
This also makes sense if some representations of the object are used more often than others (in your case maybe Point(0, 0, 0)), e.g.:
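A minimal sketch of what that could look like, with a valueOf/parse-style factory and a pre-built constant for the common case (everything beyond the Point name is an illustrative assumption):

public final class Point {
    // The all-zero point is requested far more often than others,
    // so keep a single shared instance of it.
    private static final Point ORIGIN = new Point(0, 0, 0);

    final int x;
    final int y;
    final int z;

    private Point(int x, int y, int z) {
        this.x = x;
        this.y = y;
        this.z = z;
    }

    // Static factory: callers never see the constructor, so the class is
    // free to hand back a cached instance instead of a new one.
    public static Point valueOf(int x, int y, int z) {
        if (x == 0 && y == 0 && z == 0) {
            return ORIGIN;
        }
        return new Point(x, y, z);
    }

    // "Expressed in a string": parse "x,y,z" back into a Point.
    public static Point parse(String s) {
        String[] parts = s.split(",");
        return valueOf(Integer.parseInt(parts[0]),
                       Integer.parseInt(parts[1]),
                       Integer.parseInt(parts[2]));
    }
}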
The problem you are likely to have is making the object pool lightweight enough to be cheaper than just creating the objects. You want the pool to be large enough that you get a fairly high hit rate.
In my experience, you are likely to have problems micro-benchmarking this. When you are creating a single object type repeatedly in a micro-benchmark, you get much better results than when creating a variety of objects in a real/complex application.
The problem with many object pool approaches is that they a) require a key object, which costs as much as or more than creating a simple object, b) involve some synchronization/locking, which again can cost as much as creating an object, and c) require an extra object when adding to the cache (e.g. a Map.Entry), meaning your hit rate has to be much better for the cache to be worthwhile.
The most lightweight, but dumb, caching strategy I know is to use a fixed-size array indexed by hash code, e.g.:
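A best-effort sketch of that idea, assuming Point has an (x, y, z) constructor and its fields are visible to the cache (the names and the cache size are illustrative):

public final class PointCache {
    // Fixed-size, direct-mapped cache: each slot holds whichever Point last
    // hashed there, so a collision simply overwrites the previous entry.
    private static final int SIZE = 1024;               // must be a power of two
    private static final Point[] CACHE = new Point[SIZE];

    public static Point get(int x, int y, int z) {
        int hash = (x * 31 + y) * 31 + z;
        int index = hash & (SIZE - 1);                   // cheap, lock-free lookup
        Point p = CACHE[index];
        if (p != null && p.x == x && p.y == y && p.z == z) {
            return p;                                    // hit: reuse the cached instance
        }
        p = new Point(x, y, z);
        CACHE[index] = p;                                // miss: evict by overwriting
        return p;
    }
}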
Note: the array is not thread-safe, but since Point is immutable, this doesn't matter. The cache works on a best-effort basis, and is naturally limited in size with a very simple eviction strategy.
For testing purposes, you can add hit/miss counters to determine the cache's effectiveness for your data set.
Remember that caching these objects will influence concurrency and garbage collection in (most likely) a bad way. I wouldn't do it unless the other objects that refer to the points are long-lived too.
It sounds almost like a textbook example of the Flyweight pattern.