Replies: 1 comment
-
You have to write your own Python code to load Whisper with PyTorch on the NPU (via torch_npu).
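In case it helps, here is a minimal sketch of what that code could look like, assuming torch_npu and a recent transformers release are installed and a single Ascend device is visible; the checkpoint name and the silent placeholder waveform are illustrative only:

```python
import numpy as np
import torch
import torch_npu  # Huawei's adapter; registers the "npu" device with PyTorch
from transformers import WhisperProcessor, WhisperForConditionalGeneration

device = torch.device("npu:0")  # first visible Ascend NPU

# Load a Whisper checkpoint and move it onto the NPU
processor = WhisperProcessor.from_pretrained("openai/whisper-small")
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small").to(device)

# Placeholder input: one second of silence at 16 kHz; replace with real audio
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

# Run generation on the NPU and decode back to text
predicted_ids = model.generate(inputs.input_features.to(device))
print(processor.batch_decode(predicted_ids, skip_special_tokens=True))
```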
-
In China, the Ascend NPU is the second choice after Nvidia GPUs and has been adopted by many companies, such as Alibaba, ByteDance, and Meituan. Huawei officially released an adapter called torch_npu to run PyTorch on the Ascend NPU. torch_npu is friendly to developers, so we can still enjoy the same PyTorch experience we are accustomed to today.
Since the NPU is supported in Transformers and Accelerate, is there any way to deploy Whisper on the NPU?
Reference: https://github.com/huggingface/transformers/issues/22600
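As an illustration of the Transformers-side NPU support mentioned above, a pipeline-level sketch might look like the following, assuming torch_npu is installed and the installed transformers version includes the NPU support tracked in the linked issue; the model name, device index, and audio path are placeholders:

```python
import torch
import torch_npu  # registers the "npu" device with PyTorch
from transformers import pipeline

# Build an ASR pipeline directly on the first Ascend NPU
asr = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-small",
    device=torch.device("npu:0"),
)

# Transcribe a local audio file (path is a placeholder)
print(asr("sample.wav"))
```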