
Gradient2Limit fixes #134

Open · wants to merge 3 commits into main
Conversation

udaysagar2177

No description provided.

@udaysagar2177
Author

@elandau could you please take a look at this?

@happyomg

happyomg commented Jul 16, 2019

Hi, did you run into this situation in issue #137?
I saw you use a fixed value of 0.5 to degrade the gradient. How about using a smooth, dynamic number in (0, 1) instead of a fixed value?
If you detect a drop, you cut the gradient straight down to 0.5. But if the dropped request was just a one-off, do you still want to cut it down?
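A minimal sketch of the alternative being suggested here, assuming a hypothetical exponentially smoothed drop-ratio signal (names like `DynamicDegradeFactor` and `onSample` are illustrative, not from this PR or the concurrency-limits codebase):

```java
// Illustrative only: derive a dynamic degradation factor from an
// exponentially smoothed drop ratio, instead of a fixed 0.5.
public class DynamicDegradeFactor {
    private double smoothedDropRatio = 0.0; // EMA of the observed drop ratio
    private final double alpha = 0.2;       // EMA weight for new samples

    /** Record the drop ratio observed in the latest window, in [0, 1]. */
    public void onSample(double dropRatio) {
        smoothedDropRatio = alpha * dropRatio + (1 - alpha) * smoothedDropRatio;
    }

    /**
     * Degradation factor in [0.5, 1]: near 1 when drops are rare (barely
     * cut the gradient), approaching 0.5 only as drops persist.
     */
    public double degradeFactor() {
        return 1.0 - 0.5 * Math.min(1.0, smoothedDropRatio);
    }
}
```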

@udaysagar2177
Author

@happyomg As far as I can tell, a drop translates to back-pressure from the downstream service. In that case, using 0.5 as the gradient is probably not a bad idea; 0.5 also aligns with other limiters in this repo, such as AIMD. Remember that there is a smoothing factor to absorb any intermittent shocks.
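For context, a rough sketch of how a drop-triggered gradient of 0.5 interacts with a smoothing factor when the limit is updated (a simplified illustration under assumed names, not the actual Gradient2Limit code):

```java
// Simplified illustration: the smoothing factor dampens the effect of a
// drop-triggered 0.5 gradient. Not the actual Gradient2Limit implementation.
public class SmoothedLimit {
    private double limit = 100.0;          // current concurrency limit
    private final double smoothing = 0.2;  // weight given to the new estimate

    /**
     * @param didDrop  whether the sample saw a dropped (rejected) request
     * @param gradient latency-derived gradient to use when nothing dropped
     */
    public void update(boolean didDrop, double gradient) {
        double g = didDrop ? 0.5 : gradient; // a drop halves, as in AIMD
        double target = limit * g;
        // One accidental drop only moves the limit 20% of the way toward
        // the halved target (100 -> 90 here), absorbing intermittent shocks.
        limit = (1 - smoothing) * limit + smoothing * target;
    }

    public double getLimit() { return limit; }
}
```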

@udaysagar2177
Author

@elandau Do you have an estimate of when you would be able to review this? I have been waiting on it for 8 days.

@udaysagar2177
Author

@elandau In case you are waiting for the Travis failure to be fixed, the failure seems to be transient.

When we have a high limit, we skip adjusting the limit even when requests start taking longer to finish.
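A hedged sketch of the failure mode described above, assuming a guard that short-circuits the adjustment while utilization is low relative to a high limit (the guard condition and names are assumptions, not the actual Gradient2Limit source):

```java
// Illustrative only: with the commented-out guard in place, a high limit
// with a low in-flight count skips the update, so rising latency (gradient
// below 1.0) never shrinks the limit.
public class LimitAdjustment {
    public static double adjust(double estimatedLimit, double inflight,
                                double longTermRtt, double shortTermRtt) {
        // Gradient drops below 1.0 when requests start taking longer.
        double gradient = longTermRtt / shortTermRtt;

        // Buggy pattern (disabled): bail out at high limits.
        // if (inflight < estimatedLimit / 2) return estimatedLimit;

        // Fix: always apply the clamped gradient.
        return estimatedLimit * Math.max(0.5, Math.min(1.0, gradient));
    }

    public static void main(String[] args) {
        // Latency doubled (short-term RTT 100 vs long-term 50): limit halves.
        System.out.println(adjust(1000, 100, 50, 100)); // 500.0
    }
}
```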