Please use this identifier to cite or link to this record: https://hdl.handle.net/1822/90525

Full metadata record
DC Field: Value (Language)
dc.contributor.author: Torres, Helena R. (por)
dc.contributor.author: Morais, Pedro André Gonçalves (por)
dc.contributor.author: Fritze, Anne (por)
dc.contributor.author: Oliveira, Bruno (por)
dc.contributor.author: Veloso, Fernando (por)
dc.contributor.author: Rudiger, Mario (por)
dc.contributor.author: Fonseca, Jaime C. (por)
dc.contributor.author: Vilaça, João L. (por)
dc.date.accessioned: 2024-04-03T13:23:11Z
dc.date.available: 2024-04-03T13:23:11Z
dc.date.issued: 2022-09-08
dc.identifier.isbn: 9781728127828 (por)
dc.identifier.issn: 1557-170X (por)
dc.identifier.uri: https://hdl.handle.net/1822/90525
dc.description.abstract: Cephalometric analysis is an important and routine task in the medical field to assess craniofacial development and to diagnose cranial deformities and midline facial abnormalities. The advance of 3D digital techniques has potentiated the development of 3D cephalometry, which includes the localization of cephalometric landmarks in 3D models. However, manual labeling is still applied, being a tedious and time-consuming task that is highly prone to intra/inter-observer variability. In this paper, a framework to automatically locate cephalometric landmarks in 3D facial models is presented. The landmark detector is divided into two stages: (i) creation of 2D maps representative of the 3D model; and (ii) landmark detection through a regression convolutional neural network (CNN). In the first stage, the 3D facial model is transformed into 2D maps retrieved from 3D shape descriptors. In the second stage, a CNN is used to estimate a probability map for each landmark using the 2D representations as input. The detection method was evaluated on three different datasets of 3D facial models, namely the Texas 3DFR, the BU3DFE, and the Bosphorus databases. Average distance errors of 2.3, 3.0, and 3.2 mm were obtained for the landmarks evaluated on each dataset. The obtained results demonstrate the accuracy of the method on different 3D facial datasets, with performance competitive with state-of-the-art methods, proving its versatility across different 3D models. Clinical Relevance - Overall, the performance of the landmark detector demonstrated its potential for use in 3D cephalometric analysis. (An illustrative code sketch of this two-stage pipeline follows the record fields below.) (por)
dc.description.sponsorship: FCT - Fundação para a Ciência e a Tecnologia (LASI-LA/P/0104/2020) (por)
dc.language.iso: eng (por)
dc.publisher: IEEE (por)
dc.relation: NORTE-01-0145-FEDER000059 (por)
dc.relation: NORTE-01-0145-FEDER-024300 (por)
dc.relation: NORTE-01-0145-FEDER-000045 (por)
dc.relation: info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F00319%2F2020/PT (por)
dc.relation: LASI-LA/P/0104/2020 (por)
dc.relation: info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F05549%2F2020/PT (por)
dc.relation: info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F05549%2F2020/PT (por)
dc.relation: info:eu-repo/grantAgreement/FCT/POR_NORTE/SFRH%2FBD%2F136670%2F2018/PT (por)
dc.relation: info:eu-repo/grantAgreement/FCT/POR_NORTE/SFRH%2FBD%2F136721%2F2018/PT (por)
dc.relation: info:eu-repo/grantAgreement/FCT/POR_NORTE/SFRH%2FBD%2F131545%2F2017/PT (por)
dc.rights: openAccess (por)
dc.title: 3D facial landmark localization for cephalometric analysis (por)
dc.type: conferencePaper (por)
dc.peerreviewed: yes (por)
dc.relation.publisherversion: https://ieeexplore.ieee.org/document/9871184 (por)
oaire.citationStartPage: 1016 (por)
oaire.citationEndPage: 1019 (por)
oaire.citationVolume: 2022-July (por)
dc.date.updated: 2024-04-03T10:42:31Z
dc.identifier.eissn: 2694-0604
dc.identifier.doi: 10.1109/EMBC48229.2022.9871184 (por)
dc.identifier.eisbn: 978-1-7281-2782-8
dc.identifier.pmid: 36083940 (por)
sdum.export.identifier: 16000
sdum.journal: Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS (por)
dc.identifier.pmc: 36083940
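
Illustrative sketch of the two-stage pipeline described in the abstract. This is not the authors' implementation: a single orthographic depth map stands in for the paper's 3D shape descriptors, the CNN is a toy architecture, and the function and class names (depth_map, HeatmapCNN, detect_landmarks) are hypothetical, assumed only for illustration.

    # Minimal sketch of a two-stage 3D facial landmark detector:
    # (i) project the 3D model to a 2D map, (ii) regress per-landmark
    # probability maps with a CNN and read off their peaks.
    # Assumptions: the face is an (N, 3) vertex array aligned so it looks
    # down the +z axis; a depth map replaces the paper's shape descriptors.
    import numpy as np
    import torch
    import torch.nn as nn

    def depth_map(vertices: np.ndarray, size: int = 128) -> np.ndarray:
        """Orthographic depth map: max z per (x, y) grid cell, scaled to [0, 1]."""
        xy, z = vertices[:, :2], vertices[:, 2]
        lo, hi = xy.min(axis=0), xy.max(axis=0)
        ij = ((xy - lo) / (hi - lo + 1e-9) * (size - 1)).astype(int)
        img = np.zeros((size, size), dtype=np.float32)
        np.maximum.at(img, (ij[:, 1], ij[:, 0]), z - z.min())
        return img / (img.max() + 1e-9)

    class HeatmapCNN(nn.Module):
        """Tiny fully convolutional regressor: 1 input map -> K landmark heatmaps."""
        def __init__(self, n_landmarks: int = 5):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.Conv2d(64, n_landmarks, 1),
            )
        def forward(self, x):
            return self.net(x)

    def detect_landmarks(vertices: np.ndarray, model: nn.Module, size: int = 128):
        """Return (K, 2) pixel coordinates of the heatmap maxima for one face."""
        img = torch.from_numpy(depth_map(vertices, size))[None, None]  # (1, 1, H, W)
        with torch.no_grad():
            heatmaps = model(img)[0]                                   # (K, H, W)
        flat = heatmaps.reshape(heatmaps.shape[0], -1).argmax(dim=1)
        return np.stack([(flat // size).numpy(), (flat % size).numpy()], axis=1)

    if __name__ == "__main__":
        # Toy usage with random vertices standing in for a 3D facial scan.
        verts = np.random.rand(5000, 3).astype(np.float32)
        model = HeatmapCNN(n_landmarks=5)
        print(detect_landmarks(verts, model))  # (5, 2) array of (row, col) peaks

The paper regresses one probability map per landmark from 2D maps built with 3D shape descriptors; the sketch keeps a single input channel and reads each landmark as the argmax of its heatmap, returning pixel (row, col) coordinates only.
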
Appears in Collections: CAlg - Artigos em revistas internacionais / Papers in international journals

Files in this record:
File: 3D_Facial_Landmark_Localization_for_cephalometric_analysis.pdf (515.54 kB, Adobe PDF)
