
[FIX] Minor errors in gemini_api.py and internvl2.py. #502

Merged · 2 commits · Jan 17, 2025

Conversation

skyil7
Contributor

@skyil7 skyil7 commented Jan 16, 2025

This PR fixes two minor errors in the model files.

Initialize self.response_persistent_file in gemini_api.py

  • When we use GeminiAPI with continual_mode=False, it raises a ValueError because self.response_persistent_file is never declared:
        if self.continual_mode:
            self.response_persistent_folder = response_persistent_folder
            if not os.path.exists(self.response_persistent_folder):
                os.makedirs(self.response_persistent_folder)
            self.response_persistent_file = os.path.join(self.response_persistent_folder, f"{self.model_version}_response.json")
            if os.path.exists(self.response_persistent_file):
  • So I initialized self.response_persistent_file with an empty string ("") before the conditional to prevent the error.
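The fix above can be sketched as follows. This is a minimal, self-contained reconstruction of the pattern, not the full class from gemini_api.py; the constructor parameters shown are assumptions based on the attribute names in the snippet, and the stubbed class omits all other setup.

```python
import os


class GeminiAPI:
    """Minimal sketch of the fixed __init__; the real class does much more."""

    def __init__(self, model_version="gemini", continual_mode=False,
                 response_persistent_folder=None):
        self.model_version = model_version
        self.continual_mode = continual_mode
        # The fix: unconditionally give the attribute a value, so code paths
        # that read it when continual_mode=False no longer fail.
        self.response_persistent_file = ""
        if self.continual_mode:
            self.response_persistent_folder = response_persistent_folder
            os.makedirs(self.response_persistent_folder, exist_ok=True)
            self.response_persistent_file = os.path.join(
                self.response_persistent_folder,
                f"{self.model_version}_response.json",
            )


api = GeminiAPI(continual_mode=False)
print(repr(api.response_persistent_file))  # '' instead of an error
```

With continual_mode=True and a folder supplied, the attribute is overwritten with the persistent-file path exactly as before, so existing behavior is unchanged.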

Wrong variable name in internvl2.py

  • In internvl2.py, the self.model.chat() call on line 316 requires the num_patches_list variable.
  • But on line 315, the code declares num_patch_list instead of num_patches_list:
        else:
            pixel_values = None
            num_patch_list = None
        response, history = self.model.chat(self.tokenizer, pixel_values, contexts, gen_kwargs, num_patches_list=num_patches_list, history=None, return_history=True)
  • This seems to be a typo, so I renamed the variable to num_patches_list.

BTW, Happy New Year!

@pufanyi pufanyi self-requested a review January 17, 2025 02:19
Collaborator

@pufanyi pufanyi left a comment


Hiiii!!! Thank you so much for your contribution!!! Happy New Year!!!

@pufanyi pufanyi merged commit 721ee92 into EvolvingLMMs-Lab:main Jan 17, 2025
1 check passed