
Option to buffer/queue a bridge's outgoing requests #4252

Open
K4LCIFER opened this issue Sep 3, 2024 · 3 comments
Labels
Feature-Request Issue is a feature request

Comments

@K4LCIFER

K4LCIFER commented Sep 3, 2024

Is your feature request related to a problem? Please describe.
Services often rate-limit requests, so if RSS-Bridge makes too many requests too quickly, subsequent requests from that bridge are dropped until the rate limit lifts. This is partially mitigated by the CACHE_TIMEOUT option, which specifies how long a feed's previously fetched results are cached before a new request is made, but it does nothing to prevent individual feeds of the same bridge from making requests in quick succession.

Describe the solution you'd like
There should be an option available for bridges that specifies the minimum amount of time between requests from any of a bridge's feeds. This could be tuned to the rate limit of the service in question.

It would function as follows: when a request goes out (i.e., the respective feed's cache timeout has expired), a separate timer is started that must expire before a second outgoing request can be made. If a second request arrives before the timer expires, it waits in a queue until the timer elapses before being sent. If many requests arrive at once, they are buffered and sent out individually as the timer cycles.
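As a rough illustration of the behaviour described above, here is a minimal Python sketch (RSS-Bridge itself is PHP, and `BridgeRequestQueue` and its methods are hypothetical names, not part of any existing RSS-Bridge API): requests are buffered in a queue and drained one at a time, with at least a minimum interval enforced between consecutive sends.

```python
import time
from collections import deque


class BridgeRequestQueue:
    """Hypothetical sketch: enforce a minimum interval between a
    bridge's outgoing requests by buffering them in a queue."""

    def __init__(self, min_interval: float):
        self.min_interval = min_interval  # seconds between outgoing requests
        self.last_sent = None             # monotonic timestamp of last send
        self.pending = deque()            # buffered requests, FIFO order

    def enqueue(self, request):
        """Buffer a request instead of sending it immediately."""
        self.pending.append(request)

    def drain(self, send):
        """Send all queued requests one by one, sleeping as needed so
        consecutive sends are at least min_interval apart."""
        results = []
        while self.pending:
            if self.last_sent is not None:
                wait = self.min_interval - (time.monotonic() - self.last_sent)
                if wait > 0:
                    time.sleep(wait)  # the "timer" gating the next request
            request = self.pending.popleft()
            results.append(send(request))
            self.last_sent = time.monotonic()
        return results
```

A bridge-level setting could then map to `min_interval`, so that, e.g., many Spotify artist feeds expiring at once would be spaced out rather than fired in a burst.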

Describe alternatives you've considered
None.

Additional context
I've encountered this, for example, with the Spotify bridge. If one has many artists to be fetched, they will start to be constantly rate limited.

@K4LCIFER K4LCIFER added the Feature-Request Issue is a feature request label Sep 3, 2024
@dvikan
Contributor

dvikan commented Sep 3, 2024

Due to current technology limitations, we cannot do async operations as you describe.

@K4LCIFER
Author

K4LCIFER commented Sep 3, 2024

What current technology limitations are you specifically referring to?

@dvikan
Contributor

dvikan commented Sep 3, 2024

Currently we don't have a queue for async tasks/jobs.
