MeshFormer: High-resolution Mesh Segmentation with Graph Transformer

Date
2022
Journal Title
Computer Graphics Forum
Journal ISSN
1467-8659
Publisher
The Eurographics Association and John Wiley & Sons Ltd.
Abstract
Graph transformers have achieved remarkable success in graph-based segmentation tasks. Inspired by this success, we propose a novel method named MeshFormer for applying the graph transformer to the semantic segmentation of high-resolution meshes. The main challenges are the large data size, the massive model size, and the insufficient extraction of high-resolution semantic meanings. The large data or model size necessitates unacceptably extensive computational resources, and the insufficient semantic meanings lead to inaccurate segmentation results. MeshFormer addresses these three challenges with three components. First, a boundary-preserving simplification is introduced to reduce the data size while maintaining the critical high-resolution information at segmentation boundaries. Second, a Ricci flow-based clustering algorithm is presented for constructing hierarchical structures of meshes, replacing the many convolution layers otherwise needed for global support with only a few convolutions over the hierarchy. In this way, the model size can be reduced to an acceptable range. Third, we design a graph transformer with cross-resolution convolutions, which extracts richer high-resolution semantic meanings and improves segmentation results over previous methods. Experiments show that MeshFormer achieves gains of 1.0% to 5.8% on artificial and real-world datasets.
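The third component, exchanging features between a simplified mesh and the original high-resolution mesh, can be illustrated with a short sketch. The PyTorch snippet below is a minimal illustration under our own assumptions, not the authors' implementation: the module name CrossResolutionAttention, the feature dimensions, and the use of standard multi-head cross-attention are all illustrative choices standing in for the paper's cross-resolution convolutions.

import torch
import torch.nn as nn

class CrossResolutionAttention(nn.Module):
    """Sketch of cross-resolution feature exchange (hypothetical, not the
    authors' code): coarse-level mesh features query fine-level features
    to recover high-resolution semantics."""

    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        # Standard multi-head attention; queries come from the coarse
        # level, keys/values from the fine level.
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, coarse: torch.Tensor, fine: torch.Tensor) -> torch.Tensor:
        # coarse: (B, Nc, dim) features on the simplified mesh
        # fine:   (B, Nf, dim) features on the high-resolution mesh
        out, _ = self.attn(query=coarse, key=fine, value=fine)
        return self.norm(coarse + out)  # residual connection

# Toy usage: 64 coarse elements attend to 1024 fine elements.
coarse = torch.randn(1, 64, 128)
fine = torch.randn(1, 1024, 128)
print(CrossResolutionAttention()(coarse, fine).shape)  # torch.Size([1, 64, 128])

In this sketch, a single attention pass lets coarse-level queries gather fine-level detail, which conveys the general idea of recovering high-resolution semantics without running deep convolution stacks at full resolution.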
CCS Concepts: Computing methodologies → Neural networks; Shape analysis

        
Citation

@article{10.1111:cgf.14655,
  journal   = {Computer Graphics Forum},
  title     = {{MeshFormer: High-resolution Mesh Segmentation with Graph Transformer}},
  author    = {Li, Yuan and He, Xiangyang and Jiang, Yankai and Liu, Huan and Tao, Yubo and Hai, Lin},
  year      = {2022},
  publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.14655}
}