
[Bug]: Duplicate request_id breaks the engine #10583

Closed
1 task done
tjohnson31415 opened this issue Nov 22, 2024 · 0 comments · Fixed by #11036
Labels
bug Something isn't working

Comments

@tjohnson31415 (Contributor)

Your current environment

The environment is not relevant.

Model Input Dumps

No response

🐛 Describe the bug

The request_id field used in the LLM Engine is assumed to be unique, but it is a parameter that can be set by the caller. After #9550, the request_id is configurable even by a caller to the OpenAI Server.
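To illustrate why the uniqueness assumption matters, here is a minimal sketch (with hypothetical names, not vLLM's actual classes) of the usual pattern: an engine tracks in-flight requests in a dict keyed by request_id, so a second submission with the same id silently replaces the first entry and orphans its caller.

```python
import asyncio

class TinyEngine:
    """Toy stand-in for an engine that tracks requests by id."""

    def __init__(self) -> None:
        self._pending: dict[str, asyncio.Future] = {}

    def submit(self, request_id: str) -> asyncio.Future:
        fut = asyncio.get_running_loop().create_future()
        # A duplicate id overwrites the first future; the original
        # caller is now orphaned and its future is never resolved.
        self._pending[request_id] = fut
        return fut

    def finish(self, request_id: str, result: str) -> None:
        fut = self._pending.pop(request_id, None)
        if fut is not None and not fut.done():
            fut.set_result(result)

async def main() -> None:
    engine = TinyEngine()
    first = engine.submit("id")
    second = engine.submit("id")   # same request_id as the first call
    engine.finish("id", "done")    # only resolves the *second* future
    print(second.result())         # done
    print(first.done())            # False: first caller hangs forever

asyncio.run(main())
```

The second caller gets a result while the first future stays pending indefinitely, mirroring the hung connection described below.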

I originally encountered this issue while investigating the cause of negative metrics in #10430. I found that sending two concurrent requests with the same request_id can trigger the negative-metric error. Example bash script using curl:

curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/Llama-3.2-3B-Instruct",
        "request_id": "id",
        "messages": [{"role": "user", "content": "Tell me a long story:"}],
        "max_tokens": 1024,
        "min_tokens": 1024,
        "temperature": 0
      }' &

sleep 0.5

curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/Llama-3.2-3B-Instruct",
        "request_id": "id",
        "messages": [{"role": "user", "content": "Tell me a long story:"}],
        "max_tokens": 1024,
        "min_tokens": 1024,
        "temperature": 0
      }' &

The fix in #10430 properly prevents the crash, but only one request gets a response and the other connection hangs open.

Other users have reported other bad behavior that appears to be caused by duplicate request_ids.

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.