
Automated Virtual Reconstruction of Large Skull Defects using Statistical Shape Models and Generative Adversarial Networks

We present an automated method for extrapolating missing regions in label data of the skull in an anatomically plausible manner. The ultimate goal is to design patient-specific cranial implants for correcting large, arbitrarily shaped defects of the skull that can, for example, result from trauma of the head. Our approach utilizes a 3D statistical shape model (SSM) of the skull and a 2D generative adversarial network (GAN) that is trained in an unsupervised fashion from samples of healthy patients alone. By fitting the SSM to given input labels containing the skull defect, a first approximation of the healthy state of the patient is obtained. The GAN is then applied to further correct and smooth the output of the SSM in an anatomically plausible manner. Finally, the defect region is extracted using morphological operations and subtraction between the extrapolated healthy state of the patient and the defective input labels. The method is trained and evaluated on data from the MICCAI 2020 AutoImplant challenge. It produces state-of-the-art results on the regularly shaped cut-outs that were present in the training and testing data of the challenge. Furthermore, due to the unsupervised nature of the approach, the method generalizes well to previously unseen defects of varying shapes that were only present in the hidden test dataset.
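
The final step described in the abstract, extracting the defect region by subtracting the defective input labels from the extrapolated healthy skull and cleaning the result with morphological operations, can be illustrated with a short sketch. This is a minimal illustrative example, not the authors' implementation: the function name, the use of SciPy's binary opening, the iteration count, and the connected-component filtering are assumptions.

```python
import numpy as np
from scipy import ndimage


def extract_defect(healthy_estimate: np.ndarray,
                   defective_labels: np.ndarray,
                   opening_iterations: int = 2) -> np.ndarray:
    """Return the defect region: voxels present in the extrapolated
    healthy skull but missing from the defective input labels.

    Both inputs are binary 3D label volumes of identical shape.
    (Illustrative sketch; parameter values are assumptions, not the
    settings used in the paper.)
    """
    # Voxel-wise subtraction: healthy minus defective leaves the hole.
    defect = np.logical_and(healthy_estimate > 0, defective_labels == 0)

    # Morphological opening removes thin spurious residue along the
    # boundary where the SSM/GAN output and the input labels disagree.
    defect = ndimage.binary_opening(defect, iterations=opening_iterations)

    # Keep only the largest connected component as the implant region.
    labeled, num = ndimage.label(defect)
    if num == 0:
        return defect.astype(np.uint8)
    sizes = ndimage.sum(defect, labeled, index=range(1, num + 1))
    largest = int(np.argmax(sizes)) + 1
    return (labeled == largest).astype(np.uint8)
```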
Metadata
Authors: Pedro Pimentel, Angelika Szengel, Moritz Ehlke, Hans Lamecker, Stefan Zachow, Laura Estacio, Christian Doenitz, Heiko Ramm
Editors: Jianning Li, Jan Egger
Document type: Article
Title of the parent work (English): Towards the Automatization of Cranial Implant Design in Cranioplasty
Volume: 12439
First page: 16
Last page: 27
Publisher: Springer International Publishing
Year of first publication: 2020
Remarks: Best Paper Award
DOI:https://doi.org/10.1007/978-3-030-64327-0_3