Next-token prediction capacity: general upper bounds and a lower bound for transformers

Bibliographic Details
Title: Next-token prediction capacity: general upper bounds and a lower bound for transformers
Authors: Madden, Liam; Fox, Curtis; Thrampoulidis, Christos
Publication Year: 2024
Collection: Computer Science; Mathematics
Subject Terms: Computer Science - Machine Learning; Mathematics - Optimization and Control; MSC: 15A03, 26B35
Description: Given a sequence of tokens, such as words, the task of next-token prediction is to predict the next-token conditional probability distribution. Decoder-only transformers have become effective models for this task, but their properties are still not fully understood. In particular, the largest number of distinct context sequences for which a decoder-only transformer can interpolate next-token distributions has not been established. To fill this gap, we prove upper and lower bounds on this number, which match up to a multiplicative constant. We prove these bounds both in the general setting, where next-token distributions can be arbitrary, and in the empirical setting, where they are calculated from a finite number of document sequences. Our lower bounds are for one-layer multi-head decoder-only transformers, and our proofs highlight an important injectivity property satisfied by self-attention. Furthermore, we provide numerical evidence that the minimal number of parameters for memorization is sufficient for training the model to the entropy lower bound.
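The empirical setting mentioned in the description is concrete enough to sketch. The plain-Python snippet below is a minimal illustration, not code from the paper: the toy corpus, function names, and natural-log convention are all assumptions. It builds the empirical next-token distribution for each distinct context in a small corpus and evaluates the resulting entropy lower bound, i.e., the smallest expected cross-entropy any model, however large, can achieve on that corpus.

    from collections import Counter, defaultdict
    import math

    def empirical_next_token_counts(documents):
        """Count next-token occurrences for every distinct context
        (prefix) appearing in the corpus of document sequences."""
        counts = defaultdict(Counter)
        for doc in documents:
            for t in range(1, len(doc)):
                counts[tuple(doc[:t])][doc[t]] += 1
        return counts

    def entropy_lower_bound(counts):
        """Entropy of the empirical next-token distributions, averaged
        over contexts weighted by frequency: no model can drive the
        expected cross-entropy on this corpus below this value."""
        total = sum(sum(ctr.values()) for ctr in counts.values())
        h = 0.0
        for ctr in counts.values():
            n = sum(ctr.values())
            for c in ctr.values():
                h -= (n / total) * (c / n) * math.log(c / n)
        return h

    # Toy corpus (an assumption for illustration): three short
    # "documents" over a three-token vocabulary.
    docs = [("a", "b", "c"), ("a", "b", "b"), ("a", "c", "b")]
    counts = empirical_next_token_counts(docs)
    print(len(counts), "distinct contexts to interpolate")            # -> 3
    print(f"entropy lower bound: {entropy_lower_bound(counts):.4f}")  # -> 0.5493 nats

Here len(counts) is the number of distinct context sequences a model would need to memorize, which is the quantity the paper's upper and lower bounds concern.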
Comment: V2: added a section axiomatizing the probabilistic assumptions of next-token prediction
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2405.13718
Accession Number: edsarx.2405.13718
Database: arXiv