The Transformer, with its underlying attention mechanism and ability to capture long-range dependencies, is a natural choice for unordered point cloud data. However, the local regions separated by the common sampling architecture corrupt the structural information of object instances, and the inherent relationships between adjacent local regions remain under-explored, even though local structural information is crucial for transformer-based 3D point cloud models. In this paper, we therefore propose a novel module named Local Context Propagation (LCP) to exploit message passing between neighboring local regions and make their representations more informative and discriminative. More specifically, we use the overlapping points of adjacent local regions (which are statistically shown to be prevalent) as intermediaries, then re-weight the features of these shared points from different local regions before passing them to the next layer. Inserting the LCP module between two transformer layers yields a significant improvement in network expressiveness. Finally, we design a flexible LCPFormer architecture equipped with the LCP module. The proposed method is applicable to different tasks and outperforms various transformer-based methods on benchmarks including 3D shape classification and dense prediction tasks such as 3D object detection and semantic segmentation. Code will be released for reproduction.