[Feature] cache_data expire time #201
Comments
Interesting idea. Given that the current function simply dumps the data (as YAML) into the file (source), there is no room for metadata. That means it would need to take the
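As a rough illustration of where such metadata could live, one option is wrapping the value in a hash with a creation timestamp, so the file carries both the value and enough information for an expiry check. This is a hypothetical sketch, not the module's actual API: `write_cache`, `read_cache`, and `max_age` are invented names.

```ruby
require 'yaml'
require 'tempfile'

# Hypothetical sketch: store a creation timestamp next to the value so an
# expiry check becomes possible. Names are illustrative, not the module's API.
def write_cache(path, value)
  File.write(path, YAML.dump('created' => Time.now.to_i, 'value' => value))
end

def read_cache(path, max_age)
  entry = YAML.safe_load(File.read(path))
  # Treat the entry as expired once it is at least max_age seconds old.
  return nil if Time.now.to_i - entry['created'] >= max_age
  entry['value']
end

file = Tempfile.new('cache_data')
write_cache(file.path, 'cached secret')
read_cache(file.path, 300) # fresh: returns 'cached secret'
read_cache(file.path, 0)   # max_age 0: treated as expired, returns nil
```

The trade-off is that existing cache files (bare YAML values) would no longer parse the same way, so a change like this would need a migration or format detection.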
I was taking a look at this function. I wonder if we should have a version that takes the name of a function, so that the function doesn't get run if the cache can return the data instead? I think my potential use case is slightly different, though. I'd like a generic way to cache the results of expensive function calls, e.g. if I have an expensive puppetdb query or ldapquery and I don't need to worry if the data is up to 5 minutes old.
I was looking for this exact thing as well. I've been able to create a crude version using exported resources, because I only have one node that has the "query" class applied and several other nodes that can realize the resource. However, this isn't very intuitive code, and the cache timeout equals the frequency of Puppet runs on the "query" node.
Using the cache_data function as a starting point, I created a new function, cache_function, which does what @alexjfisher and I would like: cache the results of other functions with a configurable timeout. I've never done rspec testing and have barely done any Ruby coding, so I'm going to leave this here for now so others can at least start using it and suggest improvements. If I find the time I'll try to learn how to do proper testing and submit a complete pull request, but I'm not going to complain if someone else beats me to it.
I wrote an implementation too, but cached in memory, as I wasn't sure how to safely have multiple JRuby Puppet processes using the same cache files. Not sure if it's worth taking further or not, but I'll leave it here in case anyone's interested.

```ruby
# frozen_string_literal: true

require 'benchmark'
require 'digest'
require 'json'

Puppet::Functions.create_function(:'cache_function', Puppet::Functions::InternalFunction) do
  dispatch :cache_function do
    scope_param
    param 'String[1]', :function
    optional_param 'Array', :args
    optional_param 'Integer[0]', :expiry
  end

  def cache_function(scope, function, args = [], expiry = 300)
    stacktrace = Puppet::Pops::PuppetStack.stacktrace
    file, line = stacktrace[0]
    key = generate_cache_key(function, args, file, line)
    result = nil
    from_cache = false
    time_in_seconds = Benchmark.realtime do
      if (result = fetch_from_cache(key))
        @cache_hits = cache_hits + 1
        from_cache = true
      else
        @cache_misses = cache_misses + 1
        result = scope.call_function(function, args)
        store_in_cache(key, result, expiry) if result
      end
    end
    Puppet.info("Function `#{function}` took #{(time_in_seconds * 1000).round}ms in #{file}, line:#{line}, from_cache:#{from_cache}, total_hits:#{cache_hits}, total_misses:#{cache_misses}")
    result
  end

  def cache
    # This cache is per environment and per puppet/jruby instance
    @cache ||= {}
  end

  def cache_hits
    @cache_hits ||= 0
  end

  def cache_misses
    @cache_misses ||= 0
  end

  def generate_cache_key(*args)
    Digest::SHA2.hexdigest args.to_json
  end

  def fetch_from_cache(key)
    expire_cache
    return unless cache[key]
    cache[key][:value]
  end

  def expire_cache
    Puppet.info("Expiring from function cache. Current size: #{cache.size}")
    cache.each do |k, v|
      cache.delete(k) if v[:ttl] < Time.now.to_i
    end
    Puppet.info("New cache size: #{cache.size}")
  end

  def store_in_cache(key, result, expiry)
    cache[key] = {
      ttl: Time.now.to_i + expiry,
      value: result
    }
  end
end
```
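Stripped of the Puppet plumbing, the TTL pattern the implementation above relies on can be exercised on its own. The following is a sketch for illustration only (the `TtlCache` class name is invented): a repeated fetch within the expiry window is served from the cache and never re-runs the block.

```ruby
# Minimal standalone version of the in-memory TTL pattern: entries carry an
# absolute expiry timestamp and stale ones are swept on each lookup.
class TtlCache
  def initialize
    @store = {}
  end

  # Return the cached value for key, or run the block, cache its result
  # for expiry seconds, and return it.
  def fetch(key, expiry)
    expire!
    entry = @store[key]
    return entry[:value] if entry
    value = yield
    @store[key] = { ttl: Time.now.to_i + expiry, value: value }
    value
  end

  def size
    @store.size
  end

  private

  def expire!
    @store.delete_if { |_k, v| v[:ttl] < Time.now.to_i }
  end
end

cache = TtlCache.new
calls = 0
cache.fetch(:query, 300) { calls += 1; 'expensive result' }
cache.fetch(:query, 300) { calls += 1; 'expensive result' }
# calls == 1: the second fetch hit the cache
```

As the comment in the implementation notes, such a cache is per environment and per JRuby instance, so each Puppet Server worker keeps (and expires) its own copy.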
Affected Puppet, Ruby, OS and module versions/distributions
What behaviour did you expect instead
Could an additional parameter be added to the cache_data function setting the max lifetime of the file? This would let me trivially rotate certain resources to a new value over time.

Any additional information you'd like to impart
This could have interesting interactions with facter's new caching infrastructure.