In short: `multiprocessing.Process` never works when it is not run inside an `if __name__ == "__main__":` block. I recognize that most programs should be using that guard, but I'd rather not force it on my users.
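For illustration, here is a minimal stdlib-only sketch of that guard. The "spawn" start method is forced explicitly, since it is the default on Windows and macOS and is what makes the guard mandatory; all names here are invented for the example:

```python
import multiprocessing as mp

result = None  # stays None when a spawned child re-imports this file

def work(queue):
    # Runs in the child process.
    queue.put("ok")

# Without this guard, "spawn" re-imports the main module in the child,
# re-executes the Process(...).start() call, and raises the
# bootstrapping RuntimeError quoted in this issue.
if __name__ == "__main__":
    ctx = mp.get_context("spawn")  # mimic the Windows/macOS default
    queue = ctx.Queue()
    p = ctx.Process(target=work, args=(queue,))
    p.start()
    result = queue.get()  # read before join() to avoid a full-pipe deadlock
    p.join()
    print(result)  # prints "ok"
```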
If one of my users loads any model that only has a `pytorch_model.bin`, then it'll fail, e.g.:
```
RuntimeError:
    An attempt has been made to start a new process before the
    current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:

        if __name__ == '__main__':
            freeze_support()
            ...

    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.

    To fix this issue, refer to the "Safe importing of main module"
    section in https://docs.python.org/3/library/multiprocessing.html
```
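This traceback can be reproduced without transformers at all. The hypothetical sketch below (stdlib only, invented names) writes out a script that starts a `Process` at import time with no `__main__` guard under the "spawn" start method, runs it, and captures the resulting error:

```python
import os
import subprocess
import sys
import tempfile
import textwrap

# A script that starts a child process at module top level, with no
# __main__ guard. Under "spawn", the child re-imports this module and
# hits p.start() again while still bootstrapping.
unguarded = textwrap.dedent("""\
    import multiprocessing as mp

    mp.set_start_method("spawn", force=True)
    p = mp.Process(target=print, args=("never reached",))
    p.start()
    p.join()
""")

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(unguarded)
    path = f.name

try:
    proc = subprocess.run([sys.executable, path],
                          capture_output=True, text=True)
finally:
    os.unlink(path)

# The child fails during its bootstrap import, so its stderr carries
# the same "bootstrapping phase" RuntimeError quoted above.
print(proc.stderr)
```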
Edit: To prevent people from experiencing errors, I've updated all cross-encoder models to safetensors. So you can't reproduce those anymore without specifying the revision.
Tom Aarsen
Bug Overview

Loading transformers models fails if:
- the model repository only has a `pytorch_model.bin`, and
- the load is not run inside `__name__ == "__main__"`

(Taken from #34966 (comment))