Nerva: a Truly Sparse Implementation of Neural Networks

Bibliographic Details
Title: Nerva: a Truly Sparse Implementation of Neural Networks
Authors: Wesselink, Wieger; Grooten, Bram; Xiao, Qiao; de Campos, Cassio; Pechenizkiy, Mykola
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning
Description: We introduce Nerva, a fast neural network library under development in C++. It supports sparsity by using the sparse matrix operations of Intel's Math Kernel Library (MKL), which eliminates the need for binary masks. We show that Nerva significantly decreases training time and memory usage while reaching equivalent accuracy to PyTorch. We run static sparse experiments with an MLP on CIFAR-10. On high sparsity levels like $99\%$, the runtime is reduced by a factor of $4\times$ compared to a PyTorch model using masks. Similar to other popular frameworks such as PyTorch and Keras, Nerva offers a Python interface for users to work with.
Comment: The Nerva library is available at https://github.com/wiegerw/nerva
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2407.17437
Accession Number: edsarx.2407.17437
Database: arXiv
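
To make concrete what the description means by a "PyTorch model using masks", the sketch below shows a minimal mask-based sparse layer in PyTorch: the weights are stored densely and a fixed binary mask zeroes 99% of them, so every forward pass still computes a full dense matrix product and keeps the full weight matrix in memory. This is only an illustration of the baseline approach the abstract says Nerva avoids; it is not Nerva's own API, and the layer sizes (3072 inputs, 1024 outputs, matching a flattened CIFAR-10 image) are assumptions made for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

sparsity = 0.99                 # fraction of weights forced to zero, as in the abstract
layer = nn.Linear(3072, 1024)   # dense layer; 3072 = 32*32*3 flattened CIFAR-10 input (assumed sizes)

# Fixed (static) binary mask: keep roughly (1 - sparsity) of the weights.
mask = (torch.rand_like(layer.weight) > sparsity).float()

def masked_forward(x):
    # The mask is applied to the dense weight matrix on every call, so the
    # full dense product is still computed despite 99% of the weights being zero.
    return F.linear(x, layer.weight * mask, layer.bias)

x = torch.randn(8, 3072)        # a batch of 8 flattened images
y = masked_forward(x)           # shape: (8, 1024)

According to the description, Nerva replaces this mask trick with the sparse matrix operations of Intel's MKL, so only the nonzero weights are stored and multiplied, which is where the reported runtime and memory savings at high sparsity come from.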