Photo Credit: Kontekbrothers
AI-enhanced FAF improves detection and analysis of features of degeneration in retinal images, which may help clinicians monitor IRDs, predict their progression, and treat them.
Researchers in Europe reported in the International Journal of Retina and Vitreous that an experimental AI tool that improves the ability of fundus autofluorescence (FAF) to detect and analyze features of degeneration in retinal images may potentially help physicians monitor disease, predict its progression, and more effectively treat various inherited retinal diseases (IRDs).
FAF Imaging in Diagnosing and Monitoring Specific IRDs
FAF noninvasively detects naturally occurring fluorophore molecules that absorb and emit light of specific wavelengths, and that can indicate pathologic changes such as retinal pigment epithelium atrophy or lipofuscin deposits.
Manually identifying and segmenting features of degeneration requires time and expertise and is subjective, making manual analysis infeasible for large-scale, routine use. Studies have shown that AI can improve the FAF workflow by harnessing deep learning to quickly identify and segment individual IRDs such as retinitis pigmentosa, Stargardt disease, and choroideremia.
Machine Learning Expands FAF Image Analysis to Various IRDs
“We have conducted, to our knowledge, the largest quantitative cross-sectional and longitudinal analysis of FAF features across a diverse range of IRDs in a real-world dataset, enabled by our novel automatic segmentation AI model, AIRDetect,” William Woof, PhD, and colleagues wrote.
The researchers developed AIRDetect, a deep learning model that automatically identifies and segments relevant features from FAF images in various IRD phenotypes.
They used AIRDetect to analyze images from patients with clinically and molecularly confirmed IRD who underwent 55-degree, 488 nm blue-FAF on the Heidelberg Spectralis imaging platform (Heidelberg Engineering, Heidelberg, Germany) at Moorfields Eye Hospital and Royal Liverpool Hospital in the United Kingdom between 2004 and 2019. They extracted the patients’ genotypes from the Moorfields Eye Hospital genetics database and exported the patients’ images from the Heyex database.
Overall, 45,749 FAF images covering 170 genes from 3,606 patients with IRDs were automatically segmented using AIRDetect.
The authors examined five FAF features: optic disc, relative hypo-autofluorescence (hypo-AF), hyper-autofluorescence (hyper-AF), perimacular ring of increased signal (ring), and vessels (Table). Human graders manually recorded the features in a subset of patients using a grading protocol to train the AIRDetect AI model. The researchers then applied AIRDetect to the entire imaging dataset.
They analyzed quantitative FAF imaging features cross-sectionally by gene and age and longitudinally to determine rates of progression. They validated AIRDetect feature segmentation and detection using the Dice similarity coefficient and precision/recall, respectively (Table).
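The Dice similarity coefficient is the standard way to compare an AI-generated segmentation mask against a human grader's mask: twice the overlapping area divided by the total area of both masks, ranging from 0 (no overlap) to 1 (identical). The study does not publish AIRDetect's internals, so the following is only a minimal illustrative sketch of the metric itself, with toy masks in place of real FAF segmentations:

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks.

    Returns 1.0 for identical masks, 0.0 for no overlap.
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    total = pred.sum() + truth.sum()
    if total == 0:  # both masks empty: treat as perfect agreement
        return 1.0
    return 2.0 * np.logical_and(pred, truth).sum() / total

# Toy example: two 16-pixel square masks overlapping in 4 pixels
a = np.zeros((10, 10)); a[2:6, 2:6] = 1
b = np.zeros((10, 10)); b[4:8, 4:8] = 1
print(round(dice_score(a, b), 2))  # 2*4 / (16+16) = 0.25
```

On this scale, the model-grader scores reported below (0.65 to 0.86) indicate substantial, though not pixel-perfect, agreement with human graders.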
Model-grader Dice scores were 0.86 for disc, 0.72 for hypo-AF, 0.69 for hyper-AF, 0.68 for ring, and 0.65 for vessels. The researchers found that:
- The genes with the greatest hypo-AF areas were CHM, ABCC6, ABCA4, RDH12, and RPE65, with mean per-patient areas of 41.5, 30.0, 21.9, 21.4, and 15.1 mm2, respectively.
- The genes with the greatest hyper-AF areas were BEST1, CDH23, RDH12, MYO7A, and NR2E3, with mean areas of 0.49, 0.45, 0.44, 0.39, and 0.34 mm2, respectively.
- The genes with the greatest ring areas were CDH23, NR2E3, CRX, EYS, and MYO7A, with mean areas of 3.63, 3.32, 2.84, 2.39, and 2.16 mm2, respectively.
- Vessel density was highest in EFEMP1, BEST1, TIMP3, RS1, and PRPH2, at 10.6%, 10.3%, 9.8%, 9.7%, and 8.9%, respectively, and was lower in retinitis pigmentosa and Leber congenital amaurosis genes.
- Longitudinal analysis of decreasing ring area in four retinitis pigmentosa genes (RPGR, USH2A, RHO, EYS) showed that EYS was the fastest progressor at -0.18 mm2/year.
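A per-year progression rate like the −0.18 mm2/year figure above is conventionally obtained by fitting a line to a patient's serial area measurements and taking its slope. The paper does not detail its longitudinal model, so this is only a hedged sketch of that generic approach, using hypothetical follow-up data:

```python
import numpy as np

def progression_rate(years, areas_mm2):
    """Least-squares slope of area vs. time (mm^2/year).

    Negative values indicate a shrinking feature, e.g. a
    perimacular ring contracting as disease progresses.
    """
    slope, _intercept = np.polyfit(
        np.asarray(years, dtype=float),
        np.asarray(areas_mm2, dtype=float),
        deg=1,
    )
    return slope

# Hypothetical ring-area measurements at four annual visits
years = [0.0, 1.0, 2.0, 3.0]
areas = [3.00, 2.82, 2.64, 2.46]  # shrinking by 0.18 mm^2 each year
print(round(progression_rate(years, areas), 2))  # -0.18
```

Pooling such per-eye slopes by gene is one way rates across RPGR, USH2A, RHO, and EYS cohorts could be compared.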
Promising Results, but Further Validation Needed
The researchers acknowledged that variations in phenotype and image quality limited these gene associations. They look forward to using AIRDetect to validate further clinical results and to identify new potential associations between feature patterns and genes or variants. “We plan to develop an IRD FAF image quality assessment model in the future, which should help to improve the consistency of our segmented masks and reduce noise in our analysis,” they wrote. “The diverse nature of IRD-associated pathologies might make AIRDetect useful to improve robustness for segmentation of FAF imaging for other non-IRD conditions.”