QuaRot: Outlier-Free 4-Bit Inference in Rotated LLMs

This repository contains the code for QuaRot: Outlier-Free 4-Bit Inference in Rotated LLMs.

Abstract

We introduce QuaRot, a new Quantization scheme based on Rotations, which is able to quantize LLMs end-to-end, including all weights, activations, and KV cache in 4 bits. QuaRot rotates LLMs in a way that removes outliers from the hidden state without changing the output, making quantization easier. This computational invariance is applied to the hidden state (residual) of the LLM, as well as to the activations of the feed-forward components, aspects of the attention mechanism and to the KV cache. The result is a quantized model where all matrix multiplications are performed in 4-bits, without any channels identified for retention in higher precision. Our quantized LLaMa2-70B model has losses of at most 0.29 WikiText perplexity and retains 99% of the zero-shot performance.
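To make the computational-invariance idea concrete, below is a minimal sketch (not the repository's code) of how rotating the hidden state while folding the inverse rotation into the weights leaves a linear layer's output unchanged but spreads outlier channels across all dimensions. It uses a random orthogonal matrix from a QR decomposition as a stand-in for the Hadamard-based rotations QuaRot actually uses; all variable names are illustrative.

import torch

torch.manual_seed(0)
d_in, d_out, batch = 512, 256, 8

# A linear layer and activations with a few large outlier channels,
# as commonly observed in LLM hidden states.
W = torch.randn(d_out, d_in)
x = torch.randn(batch, d_in)
x[:, :4] *= 50.0

# Random orthogonal rotation (stand-in for QuaRot's Hadamard rotations).
Q, _ = torch.linalg.qr(torch.randn(d_in, d_in))

x_rot = x @ Q   # rotate the hidden state
W_rot = W @ Q   # fold Q into the weights; Q is orthogonal, so Q^-1 = Q^T

y = x @ W.T
y_rot = x_rot @ W_rot.T   # = x Q Q^T W^T = x W^T: output is unchanged

print((y - y_rot).abs().max().item())                    # ~0 (float rounding only)
print(x.abs().max().item(), x_rot.abs().max().item())    # outliers shrink after rotation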


Usage

Compile the QuaRot kernels using the following commands:

git clone https://github.com/spcl/QuaRot.git
cd QuaRot
pip install -e .  # or pip install .
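Building the kernels requires a CUDA-enabled PyTorch install and a matching nvcc toolchain; a quick sanity check before installing, using only standard PyTorch calls (nothing QuaRot-specific):

import torch

# The CUDA extension build fails without a GPU-enabled PyTorch and a matching CUDA toolkit.
print("CUDA available:", torch.cuda.is_available())
print("PyTorch CUDA version:", torch.version.cuda)
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))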

For simulation results, see the fake_quant directory.
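The simulation path evaluates quantized models without the 4-bit kernels by quantizing and immediately dequantizing tensors. Below is a minimal per-channel symmetric round-to-nearest sketch of that idea, illustrative only; the actual fake_quant code has its own quantizer classes and options (e.g. clipping and GPTQ-based weight quantization).

import torch

def fake_quant_sym(x: torch.Tensor, n_bits: int = 4, dim: int = -1) -> torch.Tensor:
    """Simulated (fake) quantization: round to an n-bit symmetric grid, then
    dequantize back to float so the rest of the model runs in full precision."""
    qmax = 2 ** (n_bits - 1) - 1                                  # 7 for signed 4-bit
    scale = x.abs().amax(dim=dim, keepdim=True).clamp(min=1e-8) / qmax
    q = torch.clamp(torch.round(x / scale), -qmax - 1, qmax)
    return q * scale

x = torch.randn(4, 512)
x_q = fake_quant_sym(x, n_bits=4)
print((x - x_q).abs().mean().item())   # average simulated quantization error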


Citation

The full citation is:

@article{ashkboos2024quarot,
  title={QuaRot: Outlier-Free 4-Bit Inference in Rotated LLMs},
  author={Ashkboos, Saleh and Mohtashami, Amirkeivan and Croci, Maximilian L and Li, Bo and Jaggi, Martin and Alistarh, Dan and Hoefler, Torsten and Hensman, James},
  journal={arXiv preprint arXiv:2404.00456},
  year={2024}
}
