def warm_cache(value_dict, **opts):
    """
    Generic function for warming the cache with values.

    :param dict value_dict: mapping of cache keys to the values to store
    :param dict opts: optional settings: 'expiry' (absolute timestamp),
        'max_age' (seconds from now), and 'etag'
    :return: True if the pipelined writes executed, False otherwise
    """
    # Attempt to lazily initialize the cache
    if memo is None and not cache_disabled:
        _init_cache()
    if not cache_pipeline_store or not _redis_pipeline:
        _init_cache_pipeline()
    creation = time.time()
    expiry = opts.get('expiry')
    max_age = opts.get('max_age')
    if max_age is not None:
        # Use the earlier of the absolute expiry and creation + max_age
        expiry = min(x for x in (expiry, creation + max_age) if x is not None)
    failure_count = 0
    for key, value in value_dict.items():
        if failure_count > MAX_CACHE_WARM_FAILURES:
            break
        try:
            cache_pipeline_store[key] = (CURRENT_PROTOCOL_VERSION, creation, expiry, opts.get('etag'), value)
        except Exception:
            logger.exception("Exception occurred while queueing a cache pipeline command.")
            failure_count += 1
    # Flush all the pipelined commands in one round trip
    try:
        _redis_pipeline.execute()
        return True
    except Exception:
        logger.exception("Exception occurred during cache warming.")
        return False
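To illustrate how the expiry resolution above behaves, here is that logic pulled out into a standalone helper (`resolve_expiry` is a hypothetical name, not part of the proposal): when both an absolute `expiry` and a relative `max_age` are given, the earlier of the two wins.

```python
def resolve_expiry(creation, expiry=None, max_age=None):
    """Mirror of the expiry logic in warm_cache: take the earlier of an
    absolute 'expiry' timestamp and creation + max_age, if either is set."""
    if max_age is not None:
        expiry = min(x for x in (expiry, creation + max_age) if x is not None)
    return expiry

# With creation=100, an absolute expiry of 500 is capped by max_age=60 to 160.
capped = resolve_expiry(100, expiry=500, max_age=60)
```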
When using the redis store provider, it would be nice to have a way to "warm the cache" that uses Redis bulk operations. I'm using the memo decorator basically to cache database lookups, and it's kinda slow to store one result at a time.
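For context, a minimal sketch of what a bulk warm could look like: queue every key on one pipeline, then flush with a single round trip. With redis-py you would pass `redis.Redis().pipeline()`; `_FakePipeline` here is a hypothetical stand-in so the example runs without a server, and `warm_cache_bulk` is an illustrative name, not an API this library provides.

```python
import time

class _FakePipeline:
    """Stand-in for redis-py's Pipeline: queue set() calls, apply on execute()."""
    def __init__(self, store):
        self._store = store
        self._queued = []

    def set(self, key, value, ex=None):
        self._queued.append((key, value, ex))
        return self

    def execute(self):
        # Apply all queued writes at once; real Redis does this in one round trip.
        for key, value, _ex in self._queued:
            self._store[key] = value
        results = [True] * len(self._queued)
        self._queued.clear()
        return results

def warm_cache_bulk(pipeline, value_dict, expiry=None):
    """Queue every entry on one pipeline, then flush once."""
    ttl = None
    if expiry is not None:
        # Redis SET takes a relative TTL in seconds via ex=
        ttl = max(int(expiry - time.time()), 1)
    for key, value in value_dict.items():
        pipeline.set(key, value, ex=ttl)
    return all(pipeline.execute())

store = {}
ok = warm_cache_bulk(_FakePipeline(store), {"user:1": "alice", "user:2": "bob"})
```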