
dc.date.accessioned: 2023-01-26T17:44:56Z
dc.date.available: 2023-01-26T17:44:56Z
dc.date.created: 2022-05-12T14:14:14Z
dc.date.issued: 2022
dc.identifier.citation: Knuth, Franziska Hanna; Grøndahl, Aurora Rosvoll; Winter, René; Torheim, Turid Katrine Gjerstad; Negård, Anne; Holmedal, Stein Harald; Bakke, Kine Mari; Meltzer, Sebastian; Futsæther, Cecilia Marie; Redalen, Kathrine. Semi-automatic tumor segmentation of rectal cancer based on functional magnetic resonance imaging. Physics and imaging in radiation oncology (PIRO). 2022, 22, 77-84.
dc.identifier.uri: http://hdl.handle.net/10852/99287
dc.description.abstract: Background and purpose: Tumor delineation is required both for radiotherapy planning and for quantitative imaging biomarker purposes. It is a manual, time- and labor-intensive process prone to inter- and intraobserver variation. Semi- or fully automatic segmentation could provide better efficiency and consistency. This study aimed to investigate the influence of including and combining functional with anatomical magnetic resonance imaging (MRI) sequences on the quality of automatic segmentations. Materials and methods: T2-weighted (T2w), diffusion-weighted, multi-echo T2*-weighted, and contrast-enhanced dynamic multi-echo (DME) MR images of eighty-one patients with rectal cancer were used in the analysis. Four classical machine learning algorithms (adaptive boosting (ADA), linear and quadratic discriminant analysis, and support vector machines) were trained for automatic segmentation of tumor and normal tissue using different combinations of the MR images as input, followed by semi-automatic morphological post-processing. Manual delineations from two experts served as ground truth. The Sørensen-Dice similarity coefficient (DICE) and mean symmetric surface distance (MSD) were used as performance metrics in leave-one-out cross-validation. Results: Using T2w images alone, ADA outperformed the other algorithms, yielding a median per-patient DICE of 0.67 and MSD of 3.6 mm. Performance improved when functional images were added and was highest for models based on either T2w and DME images (DICE: 0.72, MSD: 2.7 mm) or all four MRI sequences (DICE: 0.72, MSD: 2.5 mm). Conclusion: Machine learning models using functional MRI, in particular DME, have the potential to improve automatic segmentation of rectal cancer relative to models using T2w MRI alone.
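The abstract's primary performance metric, the Sørensen-Dice similarity coefficient, can be sketched for binary segmentation masks as below. This is a minimal illustration of the standard DICE formula (2 × overlap divided by the total size of the two masks), not the authors' implementation; the toy masks and function name are hypothetical.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Sørensen-Dice similarity coefficient for two binary masks.

    DICE = 2 * |P ∩ T| / (|P| + |T|); 1.0 means perfect overlap.
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        # Both masks empty: conventionally treated as perfect agreement
        return 1.0
    return 2.0 * intersection / total

# Toy 4x4 example: predicted mask vs. a manual "ground truth" delineation
pred = np.zeros((4, 4), dtype=bool)
truth = np.zeros((4, 4), dtype=bool)
pred[1:3, 1:3] = True    # 4 predicted voxels
truth[1:3, 1:4] = True   # 6 ground-truth voxels, 4 of them overlapping
print(dice_coefficient(pred, truth))  # → 0.8 (= 2*4 / (4+6))
```

In the study, this per-patient score was computed against expert delineations within leave-one-out cross-validation; the complementary MSD metric instead measures the average distance between the two mask surfaces in millimeters.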
dc.language: EN
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: Semi-automatic tumor segmentation of rectal cancer based on functional magnetic resonance imaging
dc.type: Journal article
dc.creator.author: Knuth, Franziska Hanna
dc.creator.author: Grøndahl, Aurora Rosvoll
dc.creator.author: Winter, René
dc.creator.author: Torheim, Turid Katrine Gjerstad
dc.creator.author: Negård, Anne
dc.creator.author: Holmedal, Stein Harald
dc.creator.author: Bakke, Kine Mari
dc.creator.author: Meltzer, Sebastian
dc.creator.author: Futsæther, Cecilia Marie
dc.creator.author: Redalen, Kathrine
cristin.unitcode: 185,15,5,45
cristin.unitname: ML Maskinlæring
cristin.ispublished: true
cristin.fulltext: original
cristin.qualitycode: 1
dc.identifier.cristin: 2023977
dc.identifier.bibliographiccitation: info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Physics and imaging in radiation oncology (PIRO)&rft.volume=22&rft.spage=77&rft.date=2022
dc.identifier.jtitle: Physics and imaging in radiation oncology (PIRO)
dc.identifier.volume: 22
dc.identifier.startpage: 77
dc.identifier.endpage: 84
dc.identifier.doi: https://doi.org/10.1016/j.phro.2022.05.001
dc.type.document: Tidsskriftartikkel (Journal article)
dc.type.peerreviewed: Peer reviewed
dc.source.issn: 2405-6316
dc.type.version: PublishedVersion
dc.relation.project: HELSEMIDTNORGENTNU/30513
dc.relation.project: KF/198116-2018
dc.relation.project: HSØ/2013002
dc.relation.project: HSØ/2015048
dc.relation.project: HSØ/2016050



This item's license is: Attribution 4.0 International