Caching needed #15
Comments
I think caching is explicitly not supported
I agree w/ Jerry. The problem is that many home automation systems will poll all the data from the thermostat every interval, and they don't know that for this type of thermostat, that means multiple URL requests. Making one request is much better than making n requests, since the thermostat is so slow to respond. So it would be nice if caching could be turned on as an option, so that repeated requests for the same URL don't have to hit the device again. A simple time check should be sufficient, I would think.
It's nearly just as easy to use a pre-existing cache solution like beaker (http://beaker.readthedocs.io/en/latest/caching.html)
Really?
Why depend on another caching solution?
Because caching could potentially add a lot of complexity. For example, do you want caching in RAM or on disk? Do you want caching across multiple processes? That would probably require a cache daemon like redis or memcached. Why build in limited caching that might not cover all the use cases when you can use a robust, full-featured cache?

```python
# set up radiotherm and beaker
import radiotherm
from beaker.cache import CacheManager

tstat = radiotherm.get_thermostat('thermostat')
cache = CacheManager()
tstat_cache = cache.get_cache('tstat', expire=10)

# get the temperature, cached for up to 10 seconds
temp = tstat_cache.get(key='temp', createfunc=lambda: tstat.temp)
```
Alternatively, if you really want caching built-in, use Python-TStat instead.
@tubaman I'm not familiar with beaker. Does it handle all of the cases you mentioned automagically?
If not, then a simple approach, e.g. where each instantiation of the thermostat class has its own cache, would be trivial to implement and would cover 99% of the use cases I am aware of, where a user has just one instance of the class communicating with the tstat. Or are there lots of folks hitting the tstat from multiple processes/systems?
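The simple per-instance, time-checked cache described above could look something like this minimal sketch. `TimedCache` and `slow_query` are illustrative names invented here, not part of radiotherm's API; the stand-in query takes the place of the slow HTTP GET to the thermostat:

```python
import time

class TimedCache:
    """Minimal time-checked cache: re-fetch only after `ttl` seconds.

    A sketch of the "simple time check" idea from the discussion above;
    `fetch` is any callable that performs the slow HTTP request.
    """
    def __init__(self, fetch, ttl=10):
        self._fetch = fetch
        self._ttl = ttl
        self._value = None
        self._stamp = None   # monotonic time of the last real fetch

    def get(self):
        now = time.monotonic()
        if self._stamp is None or now - self._stamp >= self._ttl:
            self._value = self._fetch()   # only hit the device when stale
            self._stamp = now
        return self._value

# Stand-in for a slow thermostat query, so the effect is visible:
calls = []
def slow_query():
    calls.append(1)
    return {'temp': 71.5}

cache = TimedCache(slow_query, ttl=60)
for _ in range(5):
    temp = cache.get()['temp']

print(len(calls))   # 1: only one real request within the TTL window
```

A polling home automation system could then read through the cache every interval, and only one request per TTL window actually reaches the thermostat.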
Passing in the optional model parameter helps significantly (see radiotherm/radiotherm/__init__.py, line 63 at 614a251).
Without it, each instantiation is preceded by a query to the device to get the model (and this can happen on every POST/GET if the class instance isn't retained). As OP noted, these queries are slow. In a way, passing the model lets you cache the model information in user space and save up to half the HTTP calls to the unit. It isn't a full caching solution, but it has been good enough for me.
Some sort of caching of the HTTP request data would significantly improve performance.
Each variable evaluation triggers a new HTTP request in radiotherm, and each request takes 5 to 20 seconds on my thermostat, so a cache of GET requests would make a big difference.