I used AutoAWQ to quantize Llama 3.2 and Llama 3.1, and it raises `NotImplementedError: Cannot copy out of meta tensor; no data! Please use torch.nn.Module.to_empty() instead of torch.nn.Module.to() when moving module from meta to a different device.`
I tried changing `self._model.to(device)` to `self._model.to_empty(device)`, but it does not work.
Traceback (most recent call last):
...
File "anaconda3/envs/lmeval/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1333, in convert
raise NotImplementedError(
NotImplementedError: Cannot copy out of meta tensor; no data! Please use torch.nn.Module.to_empty() instead of torch.nn.Module.to() when moving module from meta to a different device.
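One possible reason the attempted fix still fails: `torch.nn.Module.to_empty()` takes `device` as a keyword-only argument, so `self._model.to_empty(device)` raises a `TypeError` rather than working as a drop-in replacement. A minimal sketch of the meta-tensor situation and the keyword-only call (using a toy `nn.Linear`, not the actual Llama model):

```python
import torch
import torch.nn as nn

# Build a module on the meta device: shapes exist, but no data is allocated.
with torch.device("meta"):
    model = nn.Linear(4, 4)

# model.to("cpu") would raise the NotImplementedError from the traceback,
# because there is no tensor data to copy out of the meta device.

# to_empty() takes device as a keyword-only argument:
# model.to_empty("cpu") raises TypeError; use device= instead.
model = model.to_empty(device="cpu")

# The parameters now live on the CPU but are UNINITIALIZED; real weights
# still have to be loaded afterwards, e.g. via model.load_state_dict(...).
print(model.weight.device.type)
```

Note that `to_empty()` only allocates empty storage on the target device, so even after a successful call the quantized weights must still be loaded into the module separately.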