A survey of low-bit large language models: Basics, systems, and algorithms

  • Ruihao Gong
  • Yifu Ding
  • Zining Wang
  • Chengtao Lv
  • Xingyu Zheng
  • Jinyang Du
  • Yang Yong
  • Shiqiao Gu
  • Haotong Qin
  • Jinyang Guo
  • Dahua Lin
  • Michele Magno
  • Xianglong Liu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Large language models (LLMs) have achieved remarkable advancements in natural language processing, showcasing exceptional performance across various tasks. However, their substantial memory and computational requirements pose significant challenges for practical deployment. Low-bit quantization has emerged as a critical approach to mitigating these challenges by reducing the bit-width of model parameters, activations, and gradients, thereby decreasing memory usage and computational demands. This paper presents a comprehensive survey of low-bit quantization methods tailored for LLMs, covering the fundamental principles, system implementations, and algorithmic strategies. We first introduce basic concepts and the new data formats specific to low-bit LLMs, followed by a review of frameworks and systems that facilitate low-bit LLMs across various hardware platforms. We then categorize and analyze techniques and toolkits for efficient low-bit training and inference of LLMs. Finally, we conclude with a discussion of future trends and potential advancements of low-bit LLMs. Our systematic overview from the basics, systems, and algorithms perspectives can offer valuable insights and guidelines for future work to enhance the efficiency and applicability of LLMs through low-bit quantization.
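To make the core idea concrete, here is a minimal sketch of per-tensor uniform symmetric quantization, the basic operation underlying the low-bit methods the survey covers. This is an illustrative example, not code from the paper: the function names and the 4-bit setting are assumptions for demonstration.

```python
import numpy as np

def quantize_symmetric(w, bits=4):
    """Uniform symmetric quantization: map floats to signed `bits`-wide integers.

    Illustrative sketch only; real low-bit LLM methods typically use per-channel
    or per-group scales and more careful rounding/clipping strategies.
    """
    qmax = 2 ** (bits - 1) - 1            # e.g. 7 for 4-bit signed
    scale = np.max(np.abs(w)) / qmax       # one scale for the whole tensor
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximation of the original floats."""
    return q.astype(np.float32) * scale

# Example: a tiny weight tensor quantized to 4 bits and reconstructed.
w = np.array([0.12, -0.5, 0.33, 0.07], dtype=np.float32)
q, s = quantize_symmetric(w, bits=4)
w_hat = dequantize(q, s)
```

Storing `q` (4-bit integers) plus a single float scale in place of `w` (32-bit floats) is what yields the memory savings the abstract describes; the reconstruction error is bounded by half the scale per element.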

Original language: English
Article number: 107856
Journal: Neural Networks
Volume: 192
DOIs
State: Published - Dec 2025

Keywords

  • Algorithm
  • Large language model
  • Low-bit
  • Quantization
  • System

