google common cache - default value of maximumSize

Published 2019-05-13 19:13

I just found Guava by searching for a cache API (it fits perfectly for my needs). But one question arose on reading the wiki and Javadoc - what are the default values of settings the CacheBuilder can take? The Javadoc states "These features are all optional" and "Constructs a new CacheBuilder instance with default settings, including strong keys, strong values, and no automatic eviction of any kind."

In my opinion, a good default for maximumSize would be relative to Runtime.getRuntime().freeMemory();

In the end, I want a cache that uses the memory available on a given system, so I need an eviction strategy that asks how much freeMemory() is available (probably relative to Runtime.getRuntime().maxMemory()).

2 Answers
干净又极端
Answer 1 · 2019-05-13 19:30

I found myself questioning the same thing and could not find anything on the web about it, so I ran a very primitive test. I wrote a piece of code that creates a LocalCache with the most basic setup (no maximum size, no eviction policies, nothing) and puts entries into the cache in an infinite loop, while monitoring heap usage through VisualVM.

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class CacheTest {
    public static void main(String[] args) {
        // Most basic setup: no maximum size, no eviction policy of any kind.
        Cache<String, String> cache = CacheBuilder.newBuilder().build();
        int counter = 0;
        while (true) {
            // Keep inserting distinct keys forever to see how the heap behaves.
            cache.put("key" + counter++, "value");
            System.out.println("size:" + cache.size());
        }
    }
}

As you can see from the image below, memory usage grows to the maximum available heap space and then levels off. I waited for a few minutes and no OutOfMemoryError occurred. At that point a new entry was being added only every few seconds, so an error will probably occur eventually.

Heap Dump

Conclusion: You don't have to set the maximumSize value, but I suggest you use some kind of eviction policy (expireAfterAccess or expireAfterWrite) to clean up the cache, both to avoid an eventual OutOfMemoryError and to keep the cache's performance from degrading.
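For example, a time-based eviction policy can be configured like this (the 10-minute window is an arbitrary value chosen for illustration):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

import java.util.concurrent.TimeUnit;

public class ExpiringCacheExample {
    public static void main(String[] args) {
        // Entries are evicted 10 minutes after they were last written.
        // expireAfterAccess(...) would instead reset the timer on every read.
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .expireAfterWrite(10, TimeUnit.MINUTES)
                .build();

        cache.put("key", "value");
        // Well within the window, so the entry is still present.
        System.out.println(cache.getIfPresent("key"));
    }
}
```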

不美不萌又怎样
Answer 2 · 2019-05-13 19:39

Actually, free memory isn't a great metric for cache eviction, because of garbage collection. Running low on free memory may simply mean it is time for the garbage collector to run, after which you will suddenly have plenty of free memory again. So you don't want to drop entries from the cache just because a lot of garbage has accumulated.

One option is to use softValues(), but I would strongly recommend against that, as soft references can really hurt production performance.
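For reference, softValues() is enabled as shown below; this is only to illustrate the API, not a recommendation:

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class SoftValuesExample {
    public static void main(String[] args) {
        // Values are wrapped in SoftReferences, so the garbage collector may
        // discard them under memory pressure. Eviction becomes GC-driven and
        // unpredictable, which is why this approach is discouraged above.
        Cache<String, byte[]> cache = CacheBuilder.newBuilder()
                .softValues()
                .build();

        cache.put("blob", new byte[1024]);
        // Immediately after the put, with no memory pressure, the value is present.
        System.out.println(cache.getIfPresent("blob") != null);
    }
}
```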

The right thing to do is to carefully select a maximumSize which in essence bounds the total amount of memory your cache will consume. If entries take up variable amounts of space then you can use maximumWeight instead to model that.
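A minimal sketch of both approaches (the bounds here are arbitrary example numbers; weighing String values by their length is just one way to model variable-size entries):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class BoundedCacheExample {
    public static void main(String[] args) {
        // Bound by entry count: at most 10,000 entries are kept.
        Cache<String, String> sized = CacheBuilder.newBuilder()
                .maximumSize(10_000)
                .build();

        // Bound by total weight for variable-size entries: each value is
        // weighed by its length, and the sum of weights is capped. A weigher
        // is required with maximumWeight, and maximumWeight cannot be
        // combined with maximumSize.
        Cache<String, String> weighted = CacheBuilder.newBuilder()
                .maximumWeight(1_000_000)
                .weigher((String key, String value) -> value.length())
                .build();

        sized.put("a", "1");
        weighted.put("b", "22");
        System.out.println(sized.getIfPresent("a"));
        System.out.println(weighted.getIfPresent("b"));
    }
}
```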
