
Fix decoder_input_details bug #705

Merged
merged 2 commits into main from fix-decoder-input-details on Dec 10, 2024
Conversation

ajtejankar (Contributor):
Replace PrefillTokens with NextTokens

@@ -249,7 +249,7 @@ message Generation {
     /// Request ID
     uint64 request_id = 1;
     /// Prefill tokens (optional)
-    PrefillTokens prefill_tokens = 2;
+    NextTokens prefill_tokens = 2;
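For context, the surrounding `Generation` message would read roughly as follows after this change. This is a sketch reconstructed from the hunk above; any fields beyond those shown in the diff are placeholders, not part of the actual schema:

```proto
syntax = "proto3";

// Hypothetical stand-in for the NextTokens message referenced by the diff;
// its real fields live elsewhere in the actual .proto file.
message NextTokens {
  repeated uint32 ids = 1;
}

message Generation {
  /// Request ID
  uint64 request_id = 1;
  /// Prefill tokens (optional)
  NextTokens prefill_tokens = 2;
  // ... remaining fields unchanged ...
}
```

Note that only the field's *type* changes (`PrefillTokens` → `NextTokens`); the field name `prefill_tokens` and its tag number `2` stay the same, so the wire format remains compatible as long as the two message types encode the same fields with the same tags.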
Contributor:
Should we just remove the PrefillTokens proto from here if it's not used anywhere?

ajtejankar (Author):
It's used in a couple of places in CausalLM and Seq2SeqLM. I'll replace it with NextTokens there.

@ajtejankar merged commit 2af302d into main on Dec 10, 2024
2 checks passed
@ajtejankar deleted the fix-decoder-input-details branch on December 10, 2024 at 01:35