
Caching needed #15

Open
JerryWorkman opened this issue Jan 19, 2016 · 8 comments

Comments

@JerryWorkman
Contributor

Some sort of caching of the HTTP request data would significantly improve performance.

Each variable evaluation triggers a new HTTP request in radiotherm. Each request takes 5 to 20 seconds on my thermostat. A cache of GET requests would provide much better performance.

@tubaman
Contributor

tubaman commented Jul 20, 2017

I think caching is explicitly not supported.

@TD22057

TD22057 commented Sep 3, 2017

I agree w/ Jerry. The problem is that many home automation systems poll all the data from the thermostat every interval, and they don't know that for this type of thermostat that means multiple URL requests. Making one request is much better than making n requests, since the thermostat is so slow to respond. So it would be nice if caching could be turned on as an option, so that repeated requests for the same URL don't have to hit the device again. A simple time check should be sufficient, I would think.
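A minimal sketch of the "simple time check" idea, using a hypothetical wrapper class (the `CachedGetter` name and the counter used in the example are illustrative, not part of the radiotherm API):

```python
import time

class CachedGetter:
    """Wrap a slow getter and reuse its result for `ttl` seconds.

    Hypothetical sketch; in practice the getter could be e.g.
    `lambda: tstat.temp`, where each call costs a slow HTTP request.
    """
    def __init__(self, getter, ttl=10.0):
        self._getter = getter
        self._ttl = ttl            # seconds to reuse a cached value
        self._value = None
        self._fetched_at = None    # None means "never fetched"

    def get(self):
        now = time.monotonic()
        if self._fetched_at is None or now - self._fetched_at > self._ttl:
            self._value = self._getter()   # the slow request happens here
            self._fetched_at = now
        return self._value

# Demo with a call counter standing in for the slow thermostat query:
calls = []
cached = CachedGetter(lambda: calls.append(1) or len(calls), ttl=60)
first = cached.get()    # triggers the "request"
second = cached.get()   # served from cache; no new request
```

Within the TTL window only one underlying call is made, no matter how many variables the automation system polls through the wrapper.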

@tubaman
Contributor

tubaman commented Sep 4, 2017

It's nearly as easy to use a pre-existing caching solution like beaker.

@JerryWorkman
Contributor Author

JerryWorkman commented Sep 7, 2017 via email

@tubaman
Contributor

tubaman commented Sep 22, 2017

Because caching could potentially add a lot of complexity. For example, do you want caching in RAM or on disk? Do you want caching across multiple processes? That will probably require a cache daemon like redis or memcached. Why build in limited caching that might not cover all the use cases when you can use a robust full-featured cache:

# setup radiotherm and beaker (imports added for completeness)
import radiotherm
from beaker.cache import CacheManager

tstat = radiotherm.get_thermostat('thermostat')
cache = CacheManager()
tstat_cache = cache.get_cache('tstat', expire=10)

# get cached temp (re-fetched from the device at most every 10 seconds)
temp = tstat_cache.get(key='temp', createfunc=lambda: tstat.temp)

@tubaman
Contributor

tubaman commented Sep 24, 2017

Alternatively, if you really want caching built-in, use Python-TStat instead.

@craftyguy
Contributor

@tubaman I'm not familiar with beaker... does it handle all of these cases you mentioned automagically?

For example, do you want caching in RAM or on disk? Do you want caching across multiple processes?

If not, then a simple approach, e.g. where each instantiation of the thermostat class has its own cache, would be trivial to implement and would cover 99% of the use cases I'm aware of, where a user has a single instance of the class communicating with the tstat. Or are there lots of folks hitting the tstat from multiple processes/systems?

@skimj
Contributor

skimj commented Apr 19, 2024

Passing in the optional model parameter helps significantly.

def get_thermostat(host_address=None, model=None):

Without it, instantiation is preceded by a query to the device to get its model (this can happen on every POST/GET if the class instance isn't retained). As the OP noted, these queries are slow.

In effect, it lets you cache the model information in user space and save up to half the HTTP calls to the unit. It isn't a full caching solution, but it has been good enough for me.
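To see why this can halve the traffic, here is a toy model (the fake functions and the model string are illustrative, not the radiotherm API) counting device queries with and without the model hint:

```python
requests = []  # log of simulated HTTP requests to the device

def query_model():
    requests.append('GET model')   # the slow round-trip we want to avoid
    return 'CT50 V1.94'            # placeholder model string

def fake_get_thermostat(host, model=None):
    # Mirrors the signature get_thermostat(host_address=None, model=None):
    # when model is omitted, the device must be asked for it first.
    if model is None:
        model = query_model()
    return {'host': host, 'model': model}

# Without the hint: every instantiation costs an extra device request.
fake_get_thermostat('192.168.1.100')
fake_get_thermostat('192.168.1.100')

# With the model cached in user space, no extra requests are made.
cached_model = 'CT50 V1.94'
fake_get_thermostat('192.168.1.100', model=cached_model)
```

After this runs, the request log holds two entries from the un-hinted calls and none from the hinted one.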
