We currently don't have plans to implement SiLU in ACL.
KleidiAI is better suited to accelerate the type of workloads you find in these models than ACL. Please have a look at this article discussing LLM acceleration on mobile CPUs.
Alternatively, you are welcome to submit a patch contributing to ACL. Please see our contribution guide for more information on how to add a new operator.
Output of 'strings libarm_compute.so | grep arm_compute_version':
arm_compute_version=v24.04 Build options: {'Werror': '1', 'build_dir': '//acl/build', 'debug': '0', 'neon': '1', 'opencl': '0', 'os': 'linux', 'openmp': '1', 'cppthreads': '0', 'arch': 'armv8.2-a', 'multi_isa': '1', 'fixed_format_kernels': '1', 'build': 'native'} Git hash=b'4fda7a803eaadf00ba36bd532481a33c18952089'
Platform:
Neoverse N2
Operating System:
Ubuntu 22.04
Problem description:
A fused GEMM+SiLU operator is not supported in ACL.
LLM use case: Llama-like models apply the SiLU activation to GEMM output (see the sketch below).
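For reference, a minimal NumPy sketch of the pattern in question, not ACL code: in a Llama-style gated MLP, the gate projection (a GEMM) is immediately followed by SiLU, which is the fusion candidate this issue asks about. All names and shapes below are hypothetical and for illustration only.

```python
import numpy as np

def silu(x):
    # SiLU / Swish: x * sigmoid(x)
    return x * (1.0 / (1.0 + np.exp(-x)))

def gated_mlp(x, w_gate, w_up, w_down):
    # Llama-style gated MLP: the gate GEMM is followed by SiLU,
    # then multiplied elementwise with the up projection.
    gate = silu(x @ w_gate)   # this GEMM+SiLU pair is the fusion candidate
    up = x @ w_up
    return (gate * up) @ w_down

# Toy shapes for illustration only.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 64)).astype(np.float32)
w_gate = rng.standard_normal((64, 256)).astype(np.float32)
w_up = rng.standard_normal((64, 256)).astype(np.float32)
w_down = rng.standard_normal((256, 64)).astype(np.float32)
print(gated_mlp(x, w_gate, w_up, w_down).shape)  # (4, 64)
```

Without a fused kernel, the GEMM and the elementwise SiLU run as separate passes over the activation tensor, which is the overhead a fused operator would avoid.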
Similar issue
#1083