This repository has been archived by the owner on Feb 4, 2025. It is now read-only.

Missing Model Size and Embedding Dimensions for new models #53

Open
Pringled opened this issue Oct 1, 2024 · 1 comment

Comments


Pringled commented Oct 1, 2024

Hi,

Two models that were uploaded recently (https://huggingface.co/minishlab/M2V_base_glove and https://huggingface.co/minishlab/M2V_base_output) are missing their Model Size and Embedding Dimensions on the leaderboard. However, both display this information correctly on their model cards and have working safetensors, and calling get_model_parameters_memory returns correct results for them ((102, 0.38) and (8, 0.03), respectively). I was wondering how we can fix this.
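
For reference, here is a minimal sketch of how these numbers can be reproduced from the safetensors headers alone. It assumes the reported pairs are (parameters in millions, memory in GB) with memory estimated at 4 bytes (float32) per parameter, and it uses huggingface_hub's get_safetensors_metadata rather than the leaderboard's own helper:

```python
# Rough sketch: reproduce the reported numbers from the safetensors headers,
# assuming the pairs are (parameters in millions, memory in GB) with memory
# estimated as 4 bytes (float32) per parameter.
from huggingface_hub import get_safetensors_metadata


def params_and_memory(repo_id: str) -> tuple[int, float]:
    meta = get_safetensors_metadata(repo_id)        # fetches only the header, not the weights
    n_params = sum(meta.parameter_count.values())   # total parameters across all dtypes
    millions = round(n_params / 1e6)
    memory_gb = round(n_params * 4 / 1024**3, 2)
    return millions, memory_gb


print(params_and_memory("minishlab/M2V_base_glove"))   # expected to be close to (102, 0.38)
print(params_and_memory("minishlab/M2V_base_output"))  # expected to be close to (8, 0.03)
```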

Thanks in advance!


Pringled commented Oct 4, 2024

I think this is happening because of the raise defined in https://github.com/embeddings-benchmark/leaderboard/blob/main/refresh.py#L218. Our models do not have hidden_dim or n_positions defined in config.json. However, in my opinion it should still be possible to extract the model size and embedding dimensions when they are available via the safetensors, and this raise should not prevent that. For example, our models do not have a maximum sequence length limitation, so the n_positions key does not make sense for them.
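
For illustration, here is a minimal sketch (not the leaderboard's actual code, and assuming a single-file safetensors checkpoint) of how the model size and embedding dimension could be derived from the safetensors header when hidden_dim/n_positions are absent from config.json:

```python
# Minimal sketch (not the leaderboard's actual code): fall back to the
# safetensors header when config.json has no hidden_dim / n_positions.
from huggingface_hub import get_safetensors_metadata


def size_and_dim_from_safetensors(repo_id: str) -> tuple[int, int]:
    meta = get_safetensors_metadata(repo_id)
    n_params = sum(meta.parameter_count.values())

    # Heuristic: for static embedding models the embedding dimension is the
    # last axis of the (single) 2D embedding matrix stored in the checkpoint.
    file_meta = next(iter(meta.files_metadata.values()))  # assumes a single-file checkpoint
    embed_dim = max(
        (t.shape[-1] for t in file_meta.tensors.values() if len(t.shape) == 2),
        default=0,
    )
    return n_params, embed_dim
```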
