As I griped about on Twitter, I'm looking for a cache with async bulk-loading of keys, while allowing the loading code to use a logging context that comes from the requesting call (permitting Zipkin-like tracing).
Ben suggested by DM that a getAll(keys, func) method on AsyncCache would fulfil that need, which would be great (although he also mentioned he hasn't needed it personally, as he uses request-scoped DI... does that mean that in Ben's setup the loader code has access to the request-scoped DI context? If that relies on a thread-local, I don't think it would work for my multi-threaded code).
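To make the tracing part concrete, this is roughly the call site I have in mind, assuming the proposed getAll(keys, func) existed on AsyncCache; `traceId` and `fetchFromBackend` are just placeholders for my own request context and bulk-load code:

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;

import com.github.benmanes.caffeine.cache.AsyncCache;
import com.github.benmanes.caffeine.cache.Caffeine;

class BulkLoadSketch {
  private final AsyncCache<String, String> cache = Caffeine.newBuilder().buildAsync();

  /** traceId stands in for whatever request-scoped logging context the caller holds. */
  CompletableFuture<Map<String, String>> lookup(Iterable<String> keys, String traceId) {
    // The mapping function closes over the caller's context, so the bulk loader can
    // emit Zipkin-style logs for the originating request without needing a thread-local
    // to survive the hand-off to whichever thread actually performs the load.
    // Note: getAll(keys, func) is the *proposed* method, not an existing AsyncCache API.
    return cache.getAll(keys, missingKeys -> fetchFromBackend(missingKeys, traceId));
  }

  private Map<String, String> fetchFromBackend(Iterable<? extends String> keys, String traceId) {
    throw new UnsupportedOperationException("placeholder for the real bulk fetch");
  }
}
```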
This might be asking too much, but I think the 'ideal' behaviour would be for implementations of AsyncCache to help guard against cache stampedes by only passing keys to func for values that aren't already in flight (there's a rough sketch of what I mean after the signature below). This could mean the signature of the getAll method would look something like this:
CompletableFuture<Map<K, V>> getAll(
    Iterable<? extends K> keys,
    Function<Iterable<? extends K>, Map<K, V>> mappingFunction)
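For what it's worth, here is a rough sketch of the stampede-guard semantics I mean, written as an external helper over the existing AsyncCache API purely to illustrate; a real implementation would presumably do this atomically inside the cache, and the class/helper names here are just mine:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.function.Function;

import com.github.benmanes.caffeine.cache.AsyncCache;

final class StampedeGuardSketch {

  /** Rough sketch only: real atomicity would have to live inside the cache itself. */
  static <K, V> CompletableFuture<Map<K, V>> getAll(
      AsyncCache<K, V> cache,
      Iterable<? extends K> keys,
      Function<Iterable<? extends K>, Map<K, V>> mappingFunction) {

    Map<K, CompletableFuture<V>> futures = new LinkedHashMap<>();
    List<K> missing = new ArrayList<>();

    for (K key : keys) {
      // Reuse anything already cached or already in flight...
      CompletableFuture<V> pending = cache.getIfPresent(key);
      if (pending != null) {
        futures.put(key, pending);
      } else {
        missing.add(key);
      }
    }

    if (!missing.isEmpty()) {
      // ...and hand only the remaining keys to the bulk loader, once.
      CompletableFuture<Map<K, V>> loaded =
          CompletableFuture.supplyAsync(() -> mappingFunction.apply(missing));
      for (K key : missing) {
        CompletableFuture<V> future = loaded.thenApply(map -> map.get(key));
        futures.put(key, future);
        // Publish the pending future so concurrent callers join it rather than starting
        // their own load. (There is still a window between getIfPresent and put here,
        // which is exactly why this really belongs inside the cache implementation.)
        cache.put(key, future);
      }
    }

    return CompletableFuture.allOf(futures.values().toArray(new CompletableFuture[0]))
        .thenApply(ignored -> {
          Map<K, V> result = new LinkedHashMap<>();
          futures.forEach((key, future) -> result.put(key, future.join()));
          return result;
        });
  }
}
```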
Incidentally, I tried to sign the CLA for this repo, as specified in the contributing guidelines, but could only get this error on clicking 'I agree':
