CrossEncoder gives OSError with newest Transformers version #3129
Comments
Hello! This seems to be related to multiprocessing used to convert the model from `pytorch_model.bin` to `model.safetensors`. Either way, the common fix for this multiprocessing issue is to wrap your CrossEncoder loading under an `if __name__ == "__main__":` clause:

```python
from sentence_transformers import CrossEncoder

def main():
    model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
    query = "How many people live in Berlin?"
    docs = [
        "Berlin is the capital of Germany",
        "Berlin has a population of 3.6 million people",
        "Berlin's area is 891.8 km²",
    ]
    # Compute similarity between the query and the documents
    scores = model.predict([(query, doc) for doc in docs])
    print("Query:", query)
    for doc, score in zip(docs, scores):
        print("Score:", score, "Doc:", doc)

if __name__ == "__main__":
    main()
```

I'll try and get a proper fix as well, but it'll probably have to be in transformers itself.
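Why the `if __name__ == "__main__":` guard is the fix: on platforms where Python starts worker processes with the "spawn" method (the default on macOS and Windows), each child process re-imports the main module, so any unguarded top-level code runs again in every child. This can be illustrated with only the standard library, independent of sentence_transformers; `worker` and `main` here are illustrative names, not part of any library API:

```python
import multiprocessing as mp

def worker(x):
    # Module-level function so it stays picklable under the "spawn"
    # start method used on macOS and Windows.
    return x * 2

def main():
    # Without the __main__ guard below, spawned children would re-run
    # this top-level work when they re-import the module.
    with mp.Pool(2) as pool:
        results = pool.map(worker, [1, 2, 3])
    print(results)  # [2, 4, 6]

if __name__ == "__main__":
    main()
```

The same reasoning applies to loading a CrossEncoder: if the library spawns a helper process while the model is being constructed at module top level, the child re-imports the module and hits the half-finished state, hence the guard.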
Made an issue here: huggingface/transformers#35228
I've also added
Thanks @tomaarsen. I am working on it.
Hey @tomaarsen, great, thanks for the quick response! It works now for
Hi 👋!
I noticed that CrossEncoder breaks with the newest version of Transformers. It works fine on the previous version, but on 4.47.0 I get the following error:
```
OSError: Can't load the model for 'cross-encoder/ms-marco-MiniLM-L-6-v2'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'cross-encoder/ms-marco-MiniLM-L-6-v2' is the correct path to a directory containing a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
```
Environment
Using an M3 MacBook Pro with transformers 4.47.0.
Steps to reproduce
The following code gives the error:
NOTE: Downgrading to transformers==4.46.3 fixes it.
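If pinning the dependency isn't an option, a runtime check can warn before loading. Below is a sketch using only the standard library, assuming (per this issue) that 4.47.0 is the first affected release and 4.46.3 is known good; `parse_version` and `is_affected` are hypothetical helpers for illustration, not library APIs:

```python
def parse_version(v: str) -> tuple:
    # "4.47.0" -> (4, 47, 0); ignores pre-release suffixes like "dev0"
    return tuple(int(part) for part in v.split(".")[:3] if part.isdigit())

def is_affected(version: str) -> bool:
    # The OSError was reported starting with 4.47.0; 4.46.3 is known good.
    return parse_version(version) >= (4, 47, 0)

print(is_affected("4.46.3"))  # False
print(is_affected("4.47.0"))  # True
```

In practice you would feed this `transformers.__version__` and fall back (or warn) when the installed version is in the affected range.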