Brain PET-MR attenuation correction with deep learning: method validation in adult and clinical paediatric data

Siti N. Yaakub

Research output: Working paper / Preprint

Abstract

Current methods for magnetic resonance-based positron emission tomography attenuation correction (PET-MR AC) are time-consuming and less able than computed tomography (CT)-based AC methods to capture inter-individual variability and skull abnormalities. Deep learning methods have been proposed to produce pseudo-CT from MR images, but these methods have not yet been evaluated in large clinical cohorts. Methods trained on healthy adult data may not work in clinical cohorts where skull morphometry may be abnormal, or in paediatric data where skulls tend to be thinner and smaller. Here, we train a convolutional neural network based on the U-Net to produce pseudo-CT for PET-MR AC. We trained our network on a mixed cohort of healthy adults and patients undergoing clinical PET scans for neurology investigations. We show that our method was able to produce pseudo-CT with a mean absolute error (MAE) of 100.4 ± 21.3 HU compared to reference CT, and a Jaccard overlap coefficient of 0.73 ± 0.07 between skull masks. Linear attenuation maps based on our pseudo-CT (relative MAE = 8.4 ± 2.1%) were more accurate than those based on a well-performing multi-atlas-based AC method (relative MAE = 13.1 ± 1.5%) when compared with CT-based linear attenuation maps. We then refined the trained network in a clinical paediatric cohort. MAE improved from 174.7 ± 33.6 HU with the existing adult-trained network to 127.3 ± 39.9 HU after transfer learning in the paediatric dataset, showing that transfer learning can improve pseudo-CT accuracy in paediatric data.
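To make the reported metrics concrete, the following is a minimal sketch of how the three evaluation quantities in the abstract (MAE in HU, Jaccard overlap of skull masks, and relative MAE of linear attenuation maps) could be computed with NumPy. The function names, the use of a head mask, and the exact normalisation in the relative MAE are illustrative assumptions; the abstract does not specify the paper's precise formulas.

    import numpy as np

    def mae_hu(pseudo_ct, ref_ct, mask):
        """Mean absolute error in Hounsfield units over a head mask (boolean array)."""
        return np.mean(np.abs(pseudo_ct[mask] - ref_ct[mask]))

    def jaccard(skull_a, skull_b):
        """Jaccard overlap coefficient between two binary skull masks."""
        intersection = np.logical_and(skull_a, skull_b).sum()
        union = np.logical_or(skull_a, skull_b).sum()
        return intersection / union

    def relative_mae(mu_pseudo, mu_ref, mask):
        """Relative MAE (%) between linear attenuation maps,
        normalised voxel-wise by the reference map."""
        diff = np.abs(mu_pseudo[mask] - mu_ref[mask])
        return 100.0 * np.mean(diff / mu_ref[mask])

The transfer-learning step described for the paediatric cohort could, under the assumption of a PyTorch implementation, look like the sketch below: resume from the adult-trained weights and fine-tune on paediatric MR/CT pairs at a reduced learning rate. The network placeholder, checkpoint path, data loader, learning rate, and L1 objective are all hypothetical; the abstract only states that transfer learning was used.

    import torch
    from torch import nn

    # Placeholder module standing in for the trained U-Net; in practice one
    # would load the adult-cohort weights before fine-tuning, e.g.
    #   unet.load_state_dict(torch.load("adult_pretrained.pt"))  # hypothetical path
    unet = nn.Conv3d(1, 1, kernel_size=3, padding=1)

    optimiser = torch.optim.Adam(unet.parameters(), lr=1e-5)  # small LR for fine-tuning
    loss_fn = nn.L1Loss()  # L1 objective mirrors the MAE used for evaluation

    for mr_batch, ct_batch in paediatric_loader:  # hypothetical loader of MR/CT pairs
        optimiser.zero_grad()
        loss = loss_fn(unet(mr_batch), ct_batch)
        loss.backward()
        optimiser.step()

Fine-tuning all layers at a low learning rate, rather than retraining from scratch, is one plausible way a network trained on adult skulls could be adapted to the thinner, smaller skulls of the paediatric cohort.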
Original language: English
Publisher: arXiv
Number of pages: 10
DOIs
Publication status: Published - 2 Dec 2022
Externally published: Yes

