Petersen Graph Multi-Orientation Based Multi-Scale Ternary Pattern (PGMO-MSTP): An Efficient Descriptor for Texture and Material Recognition
Title | Petersen Graph Multi-Orientation Based Multi-Scale Ternary Pattern (PGMO-MSTP): An Efficient Descriptor for Texture and Material Recognition |
Publication Type | Journal Article |
Year of Publication | 2021 |
Authors | El Khadiri I, El Merabet Y, Tarawneh AS, Ruichek Y, Chetverikov D, Touahni R, Hassanat AB |
Journal | IEEE TRANSACTIONS ON IMAGE PROCESSING |
Volume | 30 |
Pagination | 4571-4586 |
Type of Article | Article |
ISSN | 1057-7149 |
Keywords | Deep learning, LGS, LTP, Petersen graph, Texture classification, Wilcoxon signed rank test |
Abstract | Classifying and modeling texture images, especially those with significant rotation, illumination, scale, and viewpoint variations, is a hot topic in the computer vision field. Inspired by local graph structure (LGS), local ternary patterns (LTP), and their variants, this paper proposes a novel image feature descriptor for texture and material classification, which we call Petersen Graph Multi-Orientation based Multi-Scale Ternary Pattern (PGMO-MSTP). PGMO-MSTP is a histogram representation that efficiently encodes the joint information within an image across feature and scale spaces, exploiting the concepts of both LTP-like and LGS-like descriptors, in order to overcome the shortcomings of these approaches. We first designed two single-scale horizontal and vertical Petersen Graph-based Ternary Pattern descriptors (PGTPh and PGTPv). The essence of PGTPh and PGTPv is to encode each 5 × 5 image patch, extending the ideas of the LTP and LGS concepts, according to relationships between pixels sampled in a variety of spatial arrangements (i.e., up, down, left, and right) of Petersen graph-shaped oriented sampling structures. The histograms obtained from the single-scale descriptors PGTPh and PGTPv are then combined, in order to build the effective multi-scale PGMO-MSTP model. Extensive experiments are conducted on sixteen challenging texture data sets, demonstrating that PGMO-MSTP can outperform state-of-the-art handcrafted texture descriptors and deep learning-based feature extraction approaches. Moreover, a statistical comparison based on the Wilcoxon signed rank test demonstrates that PGMO-MSTP performed the best over all tested data sets. |
DOI | 10.1109/TIP.2021.3070188 |
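The abstract describes PGMO-MSTP as extending the local ternary pattern (LTP) idea: each neighbor of a center pixel is coded as +1, 0, or -1 against a tolerance band around the center, and the ternary code is split into two binary histograms. The sketch below illustrates only that underlying LTP step (Tan & Triggs style) on a 3 × 3 neighborhood; it is not the paper's Petersen-graph sampling on 5 × 5 patches, and the function name and neighbor ordering are illustrative assumptions.

```python
import numpy as np

def ltp_codes(patch, t=5):
    """Plain LTP code for the center pixel of a 3x3 patch.

    Illustrative sketch only, not the PGMO-MSTP descriptor:
    each neighbor is coded +1 if >= center + t, -1 if <= center - t,
    else 0, and the ternary pattern is split into two binary codes.
    Returns (upper, lower) as integers.
    """
    c = int(patch[1, 1])
    # 8 neighbors in clockwise order starting at the top-left corner
    neighbors = patch[[0, 0, 0, 1, 2, 2, 2, 1],
                      [0, 1, 2, 2, 2, 1, 0, 0]].astype(int)
    ternary = np.where(neighbors >= c + t, 1,
                       np.where(neighbors <= c - t, -1, 0))
    # Split the ternary pattern into "upper" (+1 bits) and "lower" (-1 bits)
    upper = sum(int(b) << i for i, b in enumerate(ternary == 1))
    lower = sum(int(b) << i for i, b in enumerate(ternary == -1))
    return upper, lower
```

Sliding this encoding over every pixel and histogramming the upper and lower codes yields the kind of single-scale histogram that, per the abstract, PGMO-MSTP generalizes with Petersen graph-shaped, multi-orientation sampling before concatenating scales.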