[BUG] Issue with LightevalTaskConfig.stop_sequence Attribute When Unset #462

Open
ryan-minato opened this issue Dec 19, 2024 · 0 comments
Labels: bug
Describe the bug

When running aimo_evals.py from the community tasks, the tokenizer raises an error because it does not accept None as input. The issue stems from how stop_sequence is handled in LightevalTaskConfig: when the field is unset, it stays None. Since most tasks in lighteval explicitly define stop_sequence, the problem may have gone unnoticed.

self.stop_sequence = tuple(self.stop_sequence) if self.stop_sequence is not None else None

Proposed Solution

Fall back to an empty tuple (tuple()) instead of None when stop_sequence is unset. Testing shows that this change resolves the issue.

self.stop_sequence = tuple(self.stop_sequence) if self.stop_sequence is not None else tuple()
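The effect of the change can be sketched with a hypothetical, simplified stand-in for LightevalTaskConfig (only the stop_sequence handling is modeled; names other than stop_sequence are illustrative, not the actual lighteval API):

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class TaskConfig:
    # Hypothetical stand-in for LightevalTaskConfig.
    stop_sequence: Optional[Sequence[str]] = None

    def __post_init__(self):
        # Proposed behavior: fall back to an empty tuple instead of None,
        # so downstream consumers (e.g. a tokenizer) always receive an
        # iterable of strings.
        self.stop_sequence = (
            tuple(self.stop_sequence) if self.stop_sequence is not None else tuple()
        )

# A task that never sets stop_sequence now yields () rather than None.
print(TaskConfig().stop_sequence)            # ()
print(TaskConfig(stop_sequence=["\n"]).stop_sequence)  # ('\n',)
```

With the current None fallback, the first call would instead hand None to the tokenizer and fail.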

To Reproduce

lighteval accelerate \
    "pretrained=gpt2" \
    "community|aimo_progress_prize_1|0|0" \
    --custom-tasks "./community_tasks/aimo_evals.py"

Expected behavior

The test should run successfully without any errors.

Version info

The issue occurs in a version installed directly from the main branch with the following command:

pip install -e ".[dev]"