Use this identifier to reference this record: https://hdl.handle.net/1822/90504

Full record
DC Field | Value | Language
dc.contributor.author | Torres, Helena R. | por
dc.contributor.author | Oliveira, Bruno | por
dc.contributor.author | Fonseca, Jaime C. | por
dc.contributor.author | Morais, Pedro André Gonçalves | por
dc.contributor.author | Vilaça, João L. | por
dc.date.accessioned | 2024-04-03T11:33:56Z | -
dc.date.available | 2024-04-03T11:33:56Z | -
dc.date.issued | 2023-12 | -
dc.identifier.isbn | 9798350324471 | por
dc.identifier.issn | 38082637 | -
dc.identifier.uri | https://hdl.handle.net/1822/90504 | -
dc.description.abstract | Medical image segmentation is a paramount task for several clinical applications, namely the diagnosis of pathologies, treatment planning, and aiding image-guided surgeries. With the development of deep learning, Convolutional Neural Networks (CNNs) have become the state of the art for medical image segmentation. However, concerns remain about precise object boundary delineation, since traditional CNNs can produce non-smooth segmentations with boundary discontinuities. In this work, a U-shaped CNN architecture is proposed that generates both a pixel-wise segmentation and a probabilistic contour map of the object to segment, in order to produce reliable segmentations at the object's boundaries. Moreover, since the segmentation and contour maps are inherently related to each other, a dual consistency loss that relates the two outputs of the network is proposed. The network is thus enforced to learn the segmentation and contour delineation tasks consistently during training. The proposed method was applied and validated on a public dataset of cardiac 3D ultrasound images of the left ventricle. The results showed the good performance of the method and its applicability to the cardiac dataset, indicating its potential for use in clinical practice for medical image segmentation. Clinical Relevance: The proposed network with the dual consistency loss scheme can improve the performance of state-of-the-art CNNs for medical image segmentation, proving its value for computer-aided diagnosis. | por
dc.description.sponsorship | - (undefined) | por
dc.language.iso | eng | por
dc.publisher | IEEE | por
dc.relation | NORTE-01-0145-FEDER-000059 | por
dc.relation | NORTE-01-0145-FEDER-000045 | por
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F00319%2F2020/PT | por
dc.relation | LASI-LA/P/0104/2020 | por
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F05549%2F2020/PT | por
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F05549%2F2020/PT | por
dc.relation | CEECINST/00039/2021 | por
dc.relation | info:eu-repo/grantAgreement/FCT/POR_NORTE/SFRH%2FBD%2F136670%2F2018/PT | por
dc.relation | info:eu-repo/grantAgreement/FCT/POR_NORTE/SFRH%2FBD%2F136721%2F2018/PT | por
dc.relation | COVID/BD/154328/2023 | por
dc.rights | openAccess | por
dc.title | Dual consistency loss for contour-aware segmentation in medical images | por
dc.type | conferencePaper | por
dc.peerreviewed | yes | por
dc.relation.publisherversion | https://ieeexplore.ieee.org/document/10340931 | por
dc.date.updated | 2024-04-03T10:29:44Z | -
dc.identifier.eissn | 2694-0604 | -
dc.identifier.doi | 10.1109/EMBC40787.2023.10340931 | por
dc.identifier.eisbn | 979-8-3503-2447-1 | -
dc.identifier.pmid | 38082637 | por
sdum.export.identifier | 14999 | -
sdum.journal | Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS | por
sdum.conferencePublication | 2023 45th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC) | por
dc.identifier.pmc | 38082637 | -
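
Note on the abstract above: it describes a U-shaped network with two output heads, a pixel-wise segmentation map and a probabilistic contour map, trained with a dual consistency loss that ties the two outputs together. The record does not include the authors' implementation, so the snippet below is only a minimal PyTorch-style sketch of how such a consistency term could look, shown for a 2-D binary case for brevity (the paper works on cardiac 3D ultrasound). The function names, the Sobel-based soft-contour operator, and the weighting factor lambda_consistency are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumptions, not the paper's implementation): a network with two
# heads -- a pixel-wise segmentation map and a probabilistic contour map -- trained
# with supervised losses on each head plus a consistency term linking the two.
# The "expected" contour is derived here from the soft segmentation with a Sobel filter.
import torch
import torch.nn.functional as F


def soft_contour(seg_probs: torch.Tensor) -> torch.Tensor:
    """Derive a soft contour map from segmentation probabilities of shape (N, 1, H, W)."""
    sobel_x = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                           device=seg_probs.device).view(1, 1, 3, 3)
    sobel_y = sobel_x.transpose(2, 3)
    gx = F.conv2d(seg_probs, sobel_x, padding=1)
    gy = F.conv2d(seg_probs, sobel_y, padding=1)
    mag = torch.sqrt(gx ** 2 + gy ** 2 + 1e-8)
    return torch.clamp(mag, 0.0, 1.0)  # keep the derived contour in [0, 1]


def dual_consistency_loss(seg_logits, contour_logits, seg_target, contour_target,
                          lambda_consistency: float = 1.0):
    """Segmentation loss + contour loss + a consistency term coupling the two heads."""
    seg_probs = torch.sigmoid(seg_logits)          # (N, 1, H, W)
    contour_probs = torch.sigmoid(contour_logits)  # (N, 1, H, W)

    loss_seg = F.binary_cross_entropy_with_logits(seg_logits, seg_target)
    loss_contour = F.binary_cross_entropy_with_logits(contour_logits, contour_target)

    # Consistency: the contour implied by the segmentation head should agree with
    # the contour head's prediction; gradients flow into both heads.
    loss_consistency = F.mse_loss(soft_contour(seg_probs), contour_probs)

    return loss_seg + loss_contour + lambda_consistency * loss_consistency
```

In a training loop, the returned scalar would simply be backpropagated, so both heads receive their own supervision while the consistency term enforces agreement between the segmentation and contour predictions.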
Appears in collections: CAlg - Artigos em revistas internacionais / Papers in international journals
DEI - Artigos em atas de congressos internacionais

Files in this record:
File | Description | Size | Format
Dual_consistency_loss_for_contour-aware_segmentation_in_medical_images.pdf | | 1,25 MB | Adobe PDF
