
Fully automated condyle segmentation using 3D convolutional neural networks

Abstract
The aim of this study was to develop an auto-segmentation algorithm for the mandibular condyle using the 3D U-Net and to perform a stress test to determine the optimal dataset size for achieving clinically acceptable accuracy. A total of 234 cone-beam computed tomography images of mandibular condyles were acquired from 117 subjects at two institutions and manually segmented to generate the ground truth. Semantic segmentation was performed using a basic 3D U-Net and a cascaded 3D U-Net. A stress test was performed using different sets of condylar images as the training, validation, and test datasets. Relative accuracy was evaluated using the dice similarity coefficient (DSC) and Hausdorff distance (HD). Across the five stages, the DSC ranged from 0.886 to 0.922 for the basic 3D U-Net and from 0.912 to 0.932 for the cascaded 3D U-Net; the HD ranged from 2.557 to 3.099 and from 2.452 to 2.600, respectively. Stage V (the largest dataset, drawn from both institutions) exhibited the highest DSC: 0.922 ± 0.021 for the basic 3D U-Net and 0.932 ± 0.023 for the cascaded 3D U-Net. Stage IV (200 samples from two institutions) performed worse than stage III (162 samples from one institution). Our results show that fully automated segmentation of mandibular condyles is possible using 3D U-Net algorithms, and that segmentation accuracy increases as training data increases.
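The abstract evaluates accuracy with the dice similarity coefficient (overlap between predicted and ground-truth masks) and the Hausdorff distance (worst-case surface disagreement). As a minimal sketch of how these two metrics are typically computed from binary 3D masks (this is illustrative NumPy/SciPy code, not the authors' implementation):

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice_coefficient(pred, truth):
    """Dice similarity coefficient (DSC) between two binary masks:
    2 * |A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(pred, truth).sum() / total

def hausdorff_distance(pred, truth):
    """Symmetric Hausdorff distance (HD) between the voxel coordinates
    of two binary masks, in voxel units."""
    p = np.argwhere(pred)
    t = np.argwhere(truth)
    return max(directed_hausdorff(p, t)[0], directed_hausdorff(t, p)[0])
```

For identical masks the DSC is 1.0 and the HD is 0.0; in practice the HD would be scaled by the CBCT voxel spacing to report millimeters, as the abstract's values suggest.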
Author(s)
Nayansi Jha; Taehun Kim; Sungwon Ham; Seung-Hak Baek; Sang-Jin Sung; Yoon-Ji Kim; Namkug Kim
Issued Date
2022
Type
Article
Keyword
Algorithms; Automation; Cone-Beam Computed Tomography; Exercise tests; Human beings; Mandible; Neural networks (Computer science)
DOI
10.1038/s41598-022-24164-y
URI
https://oak.ulsan.ac.kr/handle/2021.oak/15107
Publisher
Scientific Reports
Language
English
ISSN
2045-2322
Citation Volume
12
Citation Number
1
Citation Start Page
1
Citation End Page
8
Appears in Collections:
Medicine > Nursing
Access & License
  • Access status: Open
File List
  • No related files exist.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.