Should not overwrite the existing RSS feed if nothing changed #147
Comments
Can we compute a hash or unique identifier for a build? That is, building the site from the same files would always produce the same hash. Then we just compare the new build hash with the current build hash; if they match, there is no need to overwrite the RSS. This hash can be computed by concatenating all […]. What do you think?
@soapdog I guess that works, but isn't that overcomplicating things a bit? The only problem is that by the time we generate the RSS, the […]. So, another approach would be to read the existing RSS (if any) before wiping out the […]. To solve this we would have to read a remote RSS (from the deployed site), but that's way too much effort: it adds a requirement of supplying a deployment site URL in the config, adds a dependency on having an internet connection, and slows the build down due to the download. This is not good. Unless we can think of something better, I guess it is best to leave this as is.
Actually, if we compute a hash (as you said) and save it to the […]
Also, comparing the RSS to see whether everything but the date remains the same is non-trivial if you do it right, by comparing node contents and attributes. Since Harmonic doesn't maintain a database of built data, it's hard for it to detect whether it is building the same things over and over or building new things.
@soapdog Right. But we do have the posts' and pages' data -- we could […]
@UltCombo I like this! Let me try to build a POC...
Currently, if a new build is generated without any changes, the RSS's publication date still changes. I believe this is not good.