BlockDialect: Block-wise Fine-grained Mixed Format Quantization for Energy-Efficient LLM Inference

Bibliographic Details
Title: BlockDialect: Block-wise Fine-grained Mixed Format Quantization for Energy-Efficient LLM Inference
Authors: Jang, Wonsuk; Tambe, Thierry
Publication Year: 2025
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language, Computer Science - Machine Learning
Description: The rapidly increasing size of large language models (LLMs) presents significant challenges in memory usage and computational costs. Quantizing both weights and activations can address these issues, with hardware-supported fine-grained scaling emerging as a promising solution to mitigate outliers. However, existing methods struggle to capture nuanced block data distributions. We propose BlockDialect, a block-wise fine-grained mixed format technique that assigns a per-block optimal number format from a formatbook for better data representation. Additionally, we introduce DialectFP4, a formatbook of FP4 variants (akin to dialects) that adapt to diverse data distributions. To leverage this efficiently, we propose a two-stage approach for online DialectFP4 activation quantization. Importantly, DialectFP4 ensures energy efficiency by selecting representable values as scaled integers compatible with low-precision integer arithmetic. BlockDialect achieves 10.78% (7.48%) accuracy gain on the LLaMA3-8B (LLaMA2-7B) model compared to MXFP4 format with lower bit usage per data, while being only 5.45% (2.69%) below full precision even when quantizing full-path matrix multiplication. Focusing on how to represent over how to scale, our work presents a promising path for energy-efficient LLM inference.
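
The abstract's core mechanism is per-block format selection: each block of weights or activations is compared against a formatbook of FP4 variants, and the variant that best fits the block's distribution is used, with representable values expressed as scaled integers. The Python sketch below illustrates only that selection idea. The FORMATBOOK value grids and the quantize_block helper are hypothetical stand-ins for illustration, not the paper's actual DialectFP4 formats or its two-stage online quantization algorithm.

    import numpy as np

    # Hypothetical formatbook: each "dialect" is a set of non-negative
    # representable magnitudes expressed as integers, to be combined with a
    # per-block scale. These grids are illustrative, not the paper's.
    FORMATBOOK = {
        "fp4_e2m1": np.array([0, 1, 2, 3, 4, 6, 8, 12]),    # standard FP4-like grid
        "dialect_a": np.array([0, 1, 2, 3, 4, 5, 6, 8]),    # denser near zero
        "dialect_b": np.array([0, 1, 2, 4, 6, 8, 10, 12]),  # wider dynamic range
    }

    def quantize_block(block: np.ndarray) -> tuple[str, np.ndarray]:
        """Pick the dialect minimizing MSE for this block, then quantize.

        Returns the chosen dialect name and the dequantized block.
        """
        amax = np.abs(block).max() or 1.0
        best = None
        for name, grid in FORMATBOOK.items():
            # Per-block scale maps the block's max magnitude onto the grid's max.
            scale = amax / grid[-1]
            # Symmetric set of representable values: scaled integers.
            levels = np.concatenate([-grid[::-1], grid]) * scale
            # Round each element to its nearest representable level.
            idx = np.abs(block[:, None] - levels[None, :]).argmin(axis=1)
            deq = levels[idx]
            err = np.mean((block - deq) ** 2)
            if best is None or err < best[2]:
                best = (name, deq, err)
        return best[0], best[1]

    # Example: a 32-element block, a typical fine-grained scaling granularity.
    rng = np.random.default_rng(0)
    block = rng.standard_normal(32).astype(np.float32)
    name, deq = quantize_block(block)
    print(name, np.mean((block - deq) ** 2))

Because every level is an integer multiplied by a per-block scale, the multiply-accumulate work can in principle run in low-precision integer arithmetic, which is the energy-efficiency argument the abstract makes.
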
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2501.01144
Accession Number: edsarx.2501.01144
Database: arXiv