CloudAttention: Efficient Multi-Scale Attention Scheme For 3D Point Cloud Learning

Saleh M, Wang Y, Navab N, Busam B, Tombari F (2022)


Publication Type: Conference contribution

Publication year: 2022

Journal

Publisher: Institute of Electrical and Electronics Engineers Inc.

Book Volume: 2022-October

Pages Range: 1986-1992

Conference Proceedings Title: IEEE International Conference on Intelligent Robots and Systems

Event location: Kyoto, JPN

ISBN: 9781665479271

DOI: 10.1109/IROS47612.2022.9982276

Abstract

Processing 3D data efficiently has always been a challenge. Spatial operations on large-scale point clouds, stored as sparse data, incur extra cost. Attracted by the success of transformers, researchers have adopted multi-head attention for vision tasks. However, attention in transformers comes with quadratic complexity in the number of inputs and lacks spatial intuition on sets such as point clouds. In this work, we redesign set transformers and incorporate them into a hierarchical framework for shape classification and for part and scene segmentation. We propose a local attention unit that captures features within a spatial neighborhood. We also compute efficient and dynamic global cross attention by leveraging sampling and grouping at each iteration. Finally, to mitigate the non-heterogeneity of point clouds, we propose an efficient Multi-Scale Tokenization (MST), which extracts scale-invariant tokens for the attention operations. The proposed hierarchical model achieves state-of-the-art shape classification in mean accuracy and yields results on par with previous segmentation methods while requiring significantly fewer computations. Our proposed architecture predicts segmentation labels with around half the latency and parameter count of the previously most efficient method with comparable performance. The code is available at https://github.com/YigeWang-WHU/CloudAttention.
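To make the neighborhood-restricted attention described in the abstract concrete, below is a minimal sketch of local self-attention over k-nearest neighbors in PyTorch. This is not the authors' implementation (see the repository linked above); the module name LocalAttentionSketch, the tensor layout, and the choice of k are illustrative assumptions. It only shows the core idea: restricting each point's attention to its k spatial neighbors reduces the cost from quadratic in the number of points N to roughly N times k.

```python
# Minimal sketch of local attention over k-nearest neighbors in a point cloud.
# NOT the authors' implementation (see the linked repository); shapes, names,
# and the neighborhood size k are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalAttentionSketch(nn.Module):
    def __init__(self, dim: int, k: int = 16):
        super().__init__()
        self.k = k
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, xyz: torch.Tensor, feats: torch.Tensor) -> torch.Tensor:
        # xyz:   (B, N, 3) point coordinates
        # feats: (B, N, C) per-point features
        B, N, C = feats.shape

        # Pairwise distances -> indices of the k nearest neighbors per point.
        dists = torch.cdist(xyz, xyz)                        # (B, N, N)
        knn_idx = dists.topk(self.k, largest=False).indices  # (B, N, k)

        q = self.to_q(feats)                                 # (B, N, C)
        k = self.to_k(feats)
        v = self.to_v(feats)

        # Gather neighbor keys/values: (B, N, k, C).
        batch_idx = torch.arange(B, device=feats.device).view(B, 1, 1)
        k_nbr = k[batch_idx, knn_idx]
        v_nbr = v[batch_idx, knn_idx]

        # Each point attends only over its own spatial neighborhood,
        # so the attention map is (B, N, k) instead of (B, N, N).
        attn = (q.unsqueeze(2) * k_nbr).sum(-1) * self.scale
        attn = F.softmax(attn, dim=-1)
        out = (attn.unsqueeze(-1) * v_nbr).sum(2)            # (B, N, C)
        return out


if __name__ == "__main__":
    pts = torch.randn(2, 1024, 3)
    f = torch.randn(2, 1024, 64)
    print(LocalAttentionSketch(64, k=16)(pts, f).shape)  # torch.Size([2, 1024, 64])
```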


How to cite

APA:

Saleh, M., Wang, Y., Navab, N., Busam, B., & Tombari, F. (2022). CloudAttention: Efficient Multi-Scale Attention Scheme For 3D Point Cloud Learning. In IEEE International Conference on Intelligent Robots and Systems (pp. 1986-1992). Kyoto, JPN: Institute of Electrical and Electronics Engineers Inc.

MLA:

Saleh, Mahdi, et al. "CloudAttention: Efficient Multi-Scale Attention Scheme For 3D Point Cloud Learning." Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2022, Kyoto, JPN, Institute of Electrical and Electronics Engineers Inc., 2022. 1986-1992.
