Breast Ultra-Sound image segmentation: an optimization approach based on super-pixels and high-level descriptors
Title | Breast Ultra-Sound image segmentation: an optimization approach based on super-pixels and high-level descriptors |
Publication Type | Conference Paper |
Year of Publication | 2015 |
Authors | Massich J, Lemaitre G, Marti J, Meriaudeau F |
Editor | Meriaudeau F, Aubreton O |
Conference Name | Twelfth International Conference on Quality Control by Artificial Vision |
Conference Sponsors | Le2i; CNRS; Univ Bourgogne; IUT Le Creusot Ctr Univ Condorcet; Conseil Reg Bourgogne |
Publisher | SPIE, 1000 20th St, PO Box 10, Bellingham, WA 98227-0010 USA |
ISBN Number | 978-1-62841-699-2 |
Keywords | BI-RADS lexicon, Breast Ultra-Sound, Graph-Cuts, Machine-Learning based Segmentation, Optimization based Segmentation |
Abstract | Breast cancer is the second most common cancer and the leading cause of cancer death among women. Medical imaging has become an indispensable tool for its diagnosis and follow-up. During the last decade, the medical community has promoted the incorporation of Ultra-Sound (US) screening into the standard routine. The main reason for using US imaging is its capability, compared to other imaging techniques, to differentiate benign from malignant masses. The increasing use of US imaging encourages the development of Computer Aided Diagnosis (CAD) systems for Breast Ultra-Sound (BUS) images. However, accurate delineation of the lesions and structures of the breast is essential for CAD systems to extract the information needed to perform diagnosis. This article proposes a highly modular and flexible framework for segmenting lesions and tissues present in BUS images. The proposal takes advantage of optimization strategies using super-pixels and high-level descriptors, which are analogous to the visual cues used by radiologists. Qualitative and quantitative results are provided, showing performance within the range of the state of the art. |
DOI | 10.1117/12.2182843 |
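
For readers wanting a concrete picture of the pipeline the abstract describes (super-pixel over-segmentation, per-super-pixel descriptors, Graph-Cut optimization), here is a minimal sketch. It is not the authors' implementation: the SLIC super-pixel method, the mean-intensity descriptor, and the energy weights (`lam`, `compactness`) are all illustrative assumptions standing in for the paper's high-level, BI-RADS-inspired descriptors.

```python
import numpy as np
import maxflow                               # PyMaxflow (pip install PyMaxflow)
from skimage.segmentation import slic

def segment_bus_image(img, n_segments=400, lam=0.5):
    """Binary lesion mask for a grey-level BUS image (float array in [0, 1])."""
    # 1) Over-segment into super-pixels. SLIC is an assumption here; the
    #    paper's framework does not prescribe a specific super-pixel method.
    labels = slic(img, n_segments=n_segments, compactness=0.1,
                  start_label=0, channel_axis=None)
    n = labels.max() + 1

    # 2) One descriptor per super-pixel. Mean intensity is a stand-in for
    #    the paper's high-level descriptors; it exploits the fact that
    #    lesions tend to appear hypo-echoic (darker) in US images.
    mean_int = np.array([img[labels == i].mean() for i in range(n)])

    # 3) Unary terms: the cost of the "lesion" label grows with brightness.
    cost_lesion = mean_int
    cost_background = 1.0 - mean_int

    # 4) Pairwise terms between adjacent super-pixels (contrast-weighted
    #    Potts model) enforce spatially coherent labellings.
    h = np.stack([labels[:, :-1].ravel(), labels[:, 1:].ravel()], axis=1)
    v = np.stack([labels[:-1, :].ravel(), labels[1:, :].ravel()], axis=1)
    edges = np.concatenate([h, v])
    edges = edges[edges[:, 0] != edges[:, 1]]
    edges = np.unique(np.sort(edges, axis=1), axis=0)

    # 5) Minimise the resulting two-label energy exactly with one min-cut.
    g = maxflow.Graph[float]()
    nodes = g.add_nodes(n)
    for i in range(n):
        # source capacity = cost of "lesion", sink capacity = cost of
        # "background", so the sink segment corresponds to lesion.
        g.add_tedge(nodes[i], cost_lesion[i], cost_background[i])
    for a, b in edges:
        w = lam * np.exp(-abs(mean_int[a] - mean_int[b]))
        g.add_edge(nodes[a], nodes[b], w, w)
    g.maxflow()

    lesion = np.array([g.get_segment(nodes[i]) for i in range(n)])
    return lesion[labels].astype(bool)       # lift labels back to pixels
```

The appeal of a Graph-Cut formulation, reflected in the sketch, is that the min-cut yields the global optimum of this two-label energy in a single pass, and the framework stays modular: super-pixel method, descriptors, and potentials can each be swapped independently.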