Academic Journal

Unsupervised domain adaptation teacher–student network for retinal vessel segmentation via full-resolution refined model

Bibliographic Details
Title: Unsupervised domain adaptation teacher–student network for retinal vessel segmentation via full-resolution refined model
Authors: Kejuan Yue, Lixin Zhan, Zheng Wang
Source: Scientific Reports, Vol 15, Iss 1, Pp 1-13 (2025)
Publication Information: Nature Portfolio, 2025.
Publication Year: 2025
Collection: LCC:Medicine
LCC:Science
Subject Terms: Retinal vessel segmentation, Domain adaptation, Teacher–student network, Full-resolution, Medicine, Science
Description: Abstract Retinal blood vessels are the only blood vessels in the human body that can be observed non-invasively. Changes in vessel morphology are closely associated with hypertension, diabetes, cardiovascular disease and other systemic diseases, and computers can help doctors identify these changes by automatically segmenting blood vessels in fundus images. If we train a highly accurate segmentation model on one dataset (source domain) and apply it to another dataset (target domain) with a different data distribution, the segmentation accuracy drops sharply; this is called the domain shift problem. This paper proposes a novel unsupervised domain adaptation method to address this problem. It uses a teacher–student framework to generate pseudo labels for the target domain images, and trains the student network with a combination of source domain loss and domain adaptation loss; finally, the weights of the teacher network are updated as the exponential moving average of the student network's weights and used for the target domain segmentation. We reconstructed the encoder and decoder of the network into a full-resolution refined model by computing the training loss at multiple semantic levels and multiple label resolutions. We validated our method on two publicly available datasets, DRIVE and STARE. From STARE to DRIVE, the accuracy, sensitivity, and specificity are 0.9633, 0.8616, and 0.9733, respectively. From DRIVE to STARE, the accuracy, sensitivity, and specificity are 0.9687, 0.8470, and 0.9785, respectively. Our method outperforms most state-of-the-art unsupervised methods. Compared with domain adaptation methods, our method also has the best F1 score (0.8053) from STARE to DRIVE and a competitive F1 score (0.8001) from DRIVE to STARE.
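The abstract describes a teacher–student training loop: the teacher produces pseudo labels for unlabelled target-domain images, the student is optimized on a source-domain loss plus a domain adaptation loss, and the teacher is then updated as an exponential moving average (EMA) of the student. The sketch below illustrates that loop in PyTorch-style pseudocode; it is a minimal illustration based only on the abstract, and all names and hyperparameters (ema_update, training_step, alpha, lambda_da, the 0.5 pseudo-label threshold) are assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def ema_update(teacher, student, alpha=0.99):
    """Update teacher weights as an exponential moving average of student weights."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(alpha).add_(s_param, alpha=1.0 - alpha)


def training_step(student, teacher, src_img, src_label, tgt_img, lambda_da=1.0):
    """One training step: supervised source loss + pseudo-label loss on the target domain."""
    # Supervised segmentation loss on the labelled source-domain image.
    src_pred = student(src_img)
    loss_src = F.binary_cross_entropy_with_logits(src_pred, src_label)

    # Teacher generates (hard) pseudo labels for the unlabelled target-domain image.
    with torch.no_grad():
        pseudo = (torch.sigmoid(teacher(tgt_img)) > 0.5).float()

    # Domain adaptation loss: the student is trained to match the pseudo labels.
    tgt_pred = student(tgt_img)
    loss_da = F.binary_cross_entropy_with_logits(tgt_pred, pseudo)

    return loss_src + lambda_da * loss_da
```

In such a loop, ema_update(teacher, student) would be called after each optimizer step, and the teacher network would be used for target-domain segmentation at inference, matching the procedure outlined in the abstract. The paper's full-resolution refined model additionally sums this loss over multiple semantic levels and label resolutions, which is not shown here.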
Document Type: article
File Description: electronic resource
Language: English
ISSN: 2045-2322
Relation: https://doaj.org/toc/2045-2322
DOI: 10.1038/s41598-024-83018-x
Access URL: https://doaj.org/article/2d1f94a2823e43398c05d7005d3cd453
Accession Number: edsdoj.2d1f94a2823e43398c05d7005d3cd453
Database: Directory of Open Access Journals