Report
LaB-GATr: geometric algebra transformers for large biomedical surface and volume meshes
Title: LaB-GATr: geometric algebra transformers for large biomedical surface and volume meshes
Authors: Suk, Julian; Imre, Baris; Wolterink, Jelmer M.
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computer Vision and Pattern Recognition; Computer Science - Machine Learning
Description: Many anatomical structures can be described by surface or volume meshes. Machine learning is a promising tool to extract information from these 3D models. However, high-fidelity meshes often contain hundreds of thousands of vertices, which creates unique challenges in building deep neural network architectures. Furthermore, patient-specific meshes may not be canonically aligned, which limits the generalisation of machine learning algorithms. We propose LaB-GATr, a transformer neural network with geometric tokenisation that can effectively learn with large-scale (bio-)medical surface and volume meshes through sequence compression and interpolation. Our method extends the recently proposed geometric algebra transformer (GATr) and thus respects all Euclidean symmetries, i.e. rotation, translation and reflection, effectively mitigating the problem of canonical alignment between patients. LaB-GATr achieves state-of-the-art results on three tasks in cardiovascular hemodynamics modelling and neurodevelopmental phenotype prediction, featuring meshes of up to 200,000 vertices. Our results demonstrate that LaB-GATr is a powerful architecture for learning with high-fidelity meshes which has the potential to enable interesting downstream applications. Our implementation is publicly available. Comment: First published in "Medical Image Computing and Computer Assisted Intervention" (MICCAI), pp 185-195, 2024 by Springer Nature
Document Type: Working Paper
DOI: 10.1007/978-3-031-72390-2_18
Access URL: http://arxiv.org/abs/2403.07536
Accession Number: edsarx.2403.07536
Database: arXiv