I have thread A inserting a new element into a Guava Cache, and because of the size policy, the cache evicts the element associated with key Y.
Unfortunately, the removal process R for Y takes a long time, and while Y is being processed by R (already evicted but still inside R), another thread B tries to get the data associated with key Y.
Basically, R updates the database for key Y, and before that update completes, thread B reads the database value for key Y, which is still the old value.
The question is: how can I block thread B from accessing the element with key Y while R is doing its job?
You mention Guava Cache, but since there is no code example, I'll give a general answer.
For the following I assume that you have a "loading cache", a.k.a. "self-populating cache", setup.
Solution 1: Properly design your cache interactions and database transactions.
The update process invalidates the cache entry as soon as a transaction is started on it.
If you remove the entry from the cache and then start the transaction, you introduce a race condition.
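To illustrate the ordering, here is a minimal, self-contained sketch. The class names are hypothetical, the "database" and "cache" are plain maps, and a single `ReentrantLock` stands in for the database row lock held by an open transaction; a real setup would use your actual cache and DB transactions. The point is only the ordering: the updater starts the transaction first, then invalidates the cache entry, so a concurrent reader that misses the cache blocks on the transaction and loads the new value.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.locks.ReentrantLock;

// Hypothetical sketch of solution 1. The maps stand in for the real
// database and cache; rowLock stands in for the DB row lock that an
// open transaction would hold. Assumes the key exists in the database.
class WriteThroughSketch {
    final ConcurrentHashMap<String, String> database = new ConcurrentHashMap<>();
    final ConcurrentHashMap<String, String> cache = new ConcurrentHashMap<>();
    final ReentrantLock rowLock = new ReentrantLock();

    String get(String key) {
        String v = cache.get(key);
        if (v != null) return v;
        rowLock.lock();                 // the loader waits for an open "transaction"
        try {
            v = database.get(key);      // sees the committed value
            cache.put(key, v);
            return v;
        } finally {
            rowLock.unlock();
        }
    }

    void update(String key, String newValue) {
        rowLock.lock();                 // begin "transaction"
        try {
            cache.remove(key);          // invalidate as soon as the transaction starts
            database.put(key, newValue);
        } finally {
            rowLock.unlock();           // "commit"
        }
    }
}
```

Because the invalidation happens inside the transaction, a reader can never repopulate the cache with the stale value: either it hits the old cache entry before the update starts, or it misses and blocks until the commit.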
Solution 2: Use caches that block out concurrent operations on the same key/entry.
Take a look at Ehcache's BlockingCache, or at cache2k, where blocking behaviour is the default.
However, you still need to do additional locking at the loader level yourself, e.g. like the example below.
Solution 3: Do the locking yourself on top of the cache and wrap all cache operations. E.g. with something like:
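Here is a minimal sketch of such a wrapper, using simple lock striping: a fixed array of lock objects, with each key mapped to one stripe by its hash code. The class and method names are made up for illustration, and the inner map stands in for the real Guava cache; every operation on a key runs under that key's stripe lock, so an eviction-triggered writeback and a concurrent get on the same key cannot interleave.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Hypothetical wrapper sketch: all operations for a given key are
// serialized by synchronizing on that key's lock stripe. The inner
// map stands in for the real cache; the loader stands in for the
// database read on a cache miss.
class StripedCache<K, V> {
    private final Object[] stripes = new Object[64];
    private final ConcurrentHashMap<K, V> cache = new ConcurrentHashMap<>();
    private final Function<K, V> loader;

    StripedCache(Function<K, V> loader) {
        this.loader = loader;
        for (int i = 0; i < stripes.length; i++) {
            stripes[i] = new Object();
        }
    }

    private Object lockFor(K key) {
        // mask to keep the index non-negative for any hash code
        return stripes[(key.hashCode() & 0x7fffffff) % stripes.length];
    }

    V get(K key) {
        synchronized (lockFor(key)) {
            return cache.computeIfAbsent(key, loader);
        }
    }

    void put(K key, V value) {
        synchronized (lockFor(key)) {
            cache.put(key, value);
        }
    }

    void remove(K key) {
        synchronized (lockFor(key)) {
            cache.remove(key);
        }
    }
}
```

The slow writeback for key Y would run inside `synchronized (lockFor(Y))`, so thread B's `get(Y)` blocks until it finishes. Note that two different keys can share a stripe, which is harmless for correctness but means unrelated keys occasionally wait on each other; more stripes reduce that contention.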
You can also look at the BlockingCache implementation in Ehcache and take some inspiration from it.
Have fun!