
insert a start-up check to verify that the Crate backoff factor is less than 120 and that it does not conflict with the WQ retry interval #518

Open
chicco785 opened this issue Aug 13, 2021 · 5 comments · May be fixed by #573

Comments

@chicco785
Contributor

Have a look here, I am not sure what the impact is on queue workers.

It should be okay, but there must be a way to configure it, since with WQ you also have retry intervals. The admin should be able to configure the Crate client's backoff factor to make sure the total amount of time the client spends retrying a connection is less than the minimum retry interval in WQ.

Originally posted by @c0c0n3 in #503 (comment)
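For context, a rough sketch of the arithmetic behind that constraint, assuming the Crate client's `backoff_factor` ends up in a urllib3-style exponential retry (the numbers below are illustrative, not actual QuantumLeap defaults):

```python
# Rough sketch, assuming the sleep before the n-th retry is roughly
#   backoff_factor * (2 ** (n - 1)), capped at 120 seconds
# (that 120 s cap is why the title asks for backoff_factor < 120).
BACKOFF_CAP = 120  # seconds

def worst_case_retry_time(backoff_factor: float, retries: int) -> float:
    """Upper bound on the time spent sleeping between connection retries."""
    return sum(min(backoff_factor * (2 ** (n - 1)), BACKOFF_CAP)
               for n in range(1, retries + 1))

# Illustrative numbers only:
print(worst_case_retry_time(0.2, 5))  # 6.2 s  -> fine if WQ retries every 60 s
print(worst_case_retry_time(30, 5))   # 450 s  -> would overlap a 60 s WQ retry
```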

@github-actions
Contributor

Stale issue message

@daminichopra
Contributor

daminichopra commented Oct 19, 2021

Hi @c0c0n3 @chicco785, I would like to contribute to this issue. Please assign it to me.

Also, I have been investigating the comment and proceeded with debugging this file. Please confirm my understanding: we need to verify that the total amount of time the client spends retrying a connection (driven by the backoff_factor) is less than the WQ retry interval, and also add a warning for the same.
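A minimal start-up check along these lines might look like the sketch below (the function name, parameters, and thresholds are assumptions for illustration, not the actual QuantumLeap code):

```python
import logging

logger = logging.getLogger(__name__)

BACKOFF_CAP = 120  # urllib3 caps each sleep between retries at 120 seconds

def check_crate_retry_config(backoff_factor: float, retries: int,
                             wq_min_retry_interval: float) -> None:
    """Warn at start-up if the Crate client's retry window clashes with WQ."""
    if backoff_factor >= BACKOFF_CAP:
        logger.warning("Crate backoff factor %.1f s exceeds the %.0f s cap; "
                       "every retry will sleep the maximum time.",
                       backoff_factor, BACKOFF_CAP)
    total = sum(min(backoff_factor * (2 ** (n - 1)), BACKOFF_CAP)
                for n in range(1, retries + 1))
    if total >= wq_min_retry_interval:
        logger.warning("Crate client may spend up to %.1f s retrying a "
                       "connection, which is not below the WQ minimum retry "
                       "interval of %.1f s.", total, wq_min_retry_interval)
```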

@daminichopra daminichopra linked a pull request Oct 25, 2021 that will close this issue
@daminichopra
Contributor

daminichopra commented Oct 26, 2021


I have added a PR for this issue as per my understanding. Please let me know if any modifications are needed in the PR :)

@c0c0n3
Member

c0c0n3 commented Jan 4, 2022

@daminichopra I've added my comments to #573, thanks a lot!

@pooja1pathak
Collaborator

Hi @c0c0n3, I am continuing work on PR #573.

@c0c0n3 c0c0n3 linked a pull request Jun 16, 2023 that will close this issue