Use 'no-cache' when fetching server or update data #795
base: master
Conversation
If we read the code carefully, none of these requests, even if cached, should change anything. The API path is only checked for whether the response is OK and whether there's a server Id, and the HTML text is only checked for the CSP header, for the purposes of the CSP workaround. Then you hit the code path:

```js
console.log("Using normal navigation");
window.location = server;
```

And right there it's completely up to the web engine to load the page, meaning its cache is probably what's broken. The web HTML has an ETag header, so it should be revalidated with an If-None-Match request, but it seems this must not be happening. If the web engine loaded the latest HTML, things would already be fine, since the actual HTML references its assets with file hashes in the GET path, which would prohibit using outdated assets. But yeah, I'm almost 100% sure this PR isn't going to fix that bug (edit: apart from if the CSP workaround code path was exhibiting the same problem).
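For context, here is a rough sketch of the flow being described, with hypothetical names (a reconstruction from this comment, not the project's actual code). The point is that the two fetches only gate the navigation; whatever cache mode they use cannot change what the engine loads in the final step.

```js
// Reconstruction of the startup flow described above (hypothetical names,
// not the project's actual code).
async function navigateToServer(server) {
  // 1. API check: only "response OK" and the presence of a server Id matter,
  //    so a stale cached copy of this response changes nothing.
  const infoRes = await fetch(new URL("/System/Info", server));
  if (!infoRes.ok) return;
  const info = await infoRes.json();
  if (!info.Id) return;

  // 2. HTML check: the response is only inspected for a CSP header,
  //    to decide whether the CSP workaround is needed.
  const htmlRes = await fetch(server);
  const hasCsp = htmlRes.headers.get("Content-Security-Policy") !== null;

  // 3. Navigation: from here on, the web engine loads the page under its
  //    own cache rules; the cache mode of the two fetches above is irrelevant.
  if (!hasCsp) {
    console.log("Using normal navigation");
    window.location = server;
  }
}
```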
What could help as a workaround, without touching native code:

```js
console.log("Using normal navigation");
const navUrl = new URL(server);
navUrl.searchParams.append("t", +new Date());
window.location = navUrl.toString();
```

Except that it might not work with the redirect from the main URL to the /web/#/home.html thing... Alternatively, the Qt web engine could be set to an in-memory cache type.
I didn't carefully check the code when originally committing this; I was running off the assumption that what Doxterpepper was suggesting here would fix things. I'm not very familiar with Qt in general. Would your proposed change require an internet connection on startup? If so, that would be undesirable, I think, as some people have things set up to work internally in case of an internet outage, for example. From my general understanding, I would have thought that this PR would have fixed the caching issues in the web player itself.
Well, I mean "internet" as in a connection to your server; it could be on LAN by all means, sure. It's only that if you didn't have that connection, the app would be useless anyway.
To be honest, I'm personally somewhat surprised here. If browsers undesirably end up caching API responses like that, the no-cache headers should just be added to the responses from the SERVER, otherwise more clients could run into such issues. With that said, it still wouldn't fix things here if the underlying web browser ends up being the one to load the page, unless such headers were inserted there somewhere as well...

On the Qt side, using an in-memory cache wouldn't be too horrible for keeping permanent disk usage down, and unused cache should be paged out anyway; but with the right headers from the server it shouldn't be an issue either. Given that this player project seems somewhat stale at accepting contributions, I would suggest opening an issue on the server instead, asking to add Cache-Control headers to the server API responses and to the main index HTML response so that it is always revalidated, while only allowing caching for the static assets that have the hash in the URL query... That would solve such issues once and for all, IMO, without any dirty tricks.
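To illustrate the suggested policy: the actual server is not Node, so the Express-style sketch below is only meant to show the Cache-Control values being proposed, not where they would actually be set. API responses and the index page are always revalidated, while hash-versioned static assets may be cached long-term.

```js
// Illustrative only: the suggested Cache-Control policy expressed as
// Express middleware (the real server would set equivalent headers elsewhere).
const express = require("express");
const app = express();

// API responses and the main index page: always revalidate with the server.
// ETag / If-None-Match still make unchanged responses cheap 304s.
app.use(["/System", "/web/index.html"], (req, res, next) => {
  res.set("Cache-Control", "no-cache");
  next();
});

// Hash-versioned static assets: safe to cache "forever", since a new build
// changes the hash in the URL and therefore the cache key.
app.use("/web", express.static("web", {
  setHeaders: (res, filePath) => {
    if (!filePath.endsWith("index.html")) {
      res.set("Cache-Control", "public, max-age=31536000, immutable");
    }
  },
}));

app.listen(8096);
```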
Should fix #688. I've added a no-cache attribute to everything that either uses the /System/Info API or that only cares about the latest data anyway, since those requests are done once at the beginning of a session. I've looked and could only find the fetch requests already changed in the PR; if there are any more, please let me know and I'll fix those as well.
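As a point of reference, the kind of change described presumably looks like the sketch below (exact call sites and URLs in the PR may differ). With fetch, `cache: "no-cache"` still uses the HTTP cache but revalidates any stored response with the server first, whereas `"no-store"` would bypass the cache entirely.

```js
// Sketch of the change described: add an explicit cache mode to the fetches
// that check server/update data, so a cached response is revalidated first.
async function fetchSystemInfo(server) {
  // "no-cache": consult the HTTP cache only after revalidating with the
  // server (If-None-Match / If-Modified-Since).
  const res = await fetch(`${server}/System/Info`, { cache: "no-cache" });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```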