TY - JOUR
T1 - GCN-assisted attention-guided UNet for automated retinal OCT segmentation
AU - Oh, Dongsuk
AU - Moon, Jonghyeon
AU - Park, Kyoungtae
AU - Kim, Wonjun
AU - Yoo, Seungho
AU - Lee, Hyungwoo
AU - Yoo, Jiho
N1 - Publisher Copyright:
© 2024 Elsevier Ltd
PY - 2024/9
Y1 - 2024/9
N2 - With the aging populations of many countries, the prevalence of neovascular age-related macular degeneration (nAMD) is expected to increase. Morphological parameters such as intraretinal fluid (IRF), subretinal fluid (SRF), subretinal hyperreflective material (SHRM), and pigment epithelium detachment (PED) in spectral-domain optical coherence tomography (SD-OCT) images are vital markers for the proper treatment of nAMD, especially for assessing treatment response to determine the appropriate treatment interval and the switching of anti-vascular endothelial growth factor (VEGF) agents. Precise evaluation of changes in nAMD lesions and patient-specific treatment require quantitative evaluation of the lesions in OCT volume scans. However, manual segmentation is resource-intensive, and the number of studies on automatic segmentation is increasing rapidly. Improving automated segmentation performance on SD-OCT images requires long-range contextual inference of spatial information between retinal lesions and layers. Considering these points, this paper proposes GAGUNet (graph convolution network (GCN)-assisted attention-guided UNet), a model with a novel global reasoning module. The dataset used in the main experiment of this study underwent rigorous review by a retinal specialist from Konkuk University Hospital in Korea, who contributed to both data preprocessing and validation to ensure a qualitative assessment. We also conducted experiments on the RETOUCH dataset to demonstrate the scalability of the proposed model. Overall, our model demonstrates superior performance over the baseline models in both quantitative and qualitative evaluations.
AB - With the aging populations of many countries, the prevalence of neovascular age-related macular degeneration (nAMD) is expected to increase. Morphological parameters such as intraretinal fluid (IRF), subretinal fluid (SRF), subretinal hyperreflective material (SHRM), and pigment epithelium detachment (PED) in spectral-domain optical coherence tomography (SD-OCT) images are vital markers for the proper treatment of nAMD, especially for assessing treatment response to determine the appropriate treatment interval and the switching of anti-vascular endothelial growth factor (VEGF) agents. Precise evaluation of changes in nAMD lesions and patient-specific treatment require quantitative evaluation of the lesions in OCT volume scans. However, manual segmentation is resource-intensive, and the number of studies on automatic segmentation is increasing rapidly. Improving automated segmentation performance on SD-OCT images requires long-range contextual inference of spatial information between retinal lesions and layers. Considering these points, this paper proposes GAGUNet (graph convolution network (GCN)-assisted attention-guided UNet), a model with a novel global reasoning module. The dataset used in the main experiment of this study underwent rigorous review by a retinal specialist from Konkuk University Hospital in Korea, who contributed to both data preprocessing and validation to ensure a qualitative assessment. We also conducted experiments on the RETOUCH dataset to demonstrate the scalability of the proposed model. Overall, our model demonstrates superior performance over the baseline models in both quantitative and qualitative evaluations.
KW - Graph convolution network
KW - Medical image segmentation
KW - Multiscale skip connection
KW - Retinopathy
KW - Transformer
UR - http://www.scopus.com/inward/record.url?scp=85189088644&partnerID=8YFLogxK
U2 - 10.1016/j.eswa.2024.123620
DO - 10.1016/j.eswa.2024.123620
M3 - Article
AN - SCOPUS:85189088644
SN - 0957-4174
VL - 249
JO - Expert Systems with Applications
JF - Expert Systems with Applications
M1 - 123620
ER -