We propose a deep autoencoder with graph topology inference and filtering to achieve compact representations of unorganized 3D point clouds in an unsupervised manner. The encoder of the proposed networks adopts an architecture similar to that of PointNet, a well-established method for supervised learning on 3D point clouds. The decoder of the proposed networks involves three novel modules: the folding module, the graph-topology-inference module, and the graph-filtering module. The folding module folds a canonical 2D lattice onto the underlying surface of a 3D point cloud, achieving a coarse reconstruction; the graph-topology-inference module learns a graph topology that represents pairwise relationships between 3D points; and the graph-filtering module designs graph filters based on the learned graph topology to obtain a refined reconstruction. We further provide theoretical analyses of the proposed architecture. We derive an upper bound on the reconstruction loss and show that graph smoothness is superior to spatial smoothness as a prior for modeling 3D point clouds. In the experiments, we validate the proposed networks on three tasks: reconstruction, visualization, and classification. The experimental results show that (1) the proposed networks outperform the state-of-the-art methods in various tasks, including reconstruction and transfer classification; (2) a graph topology can be inferred as auxiliary information without specific supervision on graph topology inference; and (3) graph filtering refines the reconstruction, leading to better performance.
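To make the decoder pipeline concrete, here is a minimal sketch (PyTorch) of the three modules: folding, graph-topology inference, and graph filtering. The layer sizes, the Gaussian-kernel affinity, and the single-tap low-pass graph filter are illustrative assumptions, not the authors' exact design.

```python
import torch
import torch.nn as nn

class FoldingDecoder(nn.Module):
    """Folds a canonical 2D lattice, conditioned on a codeword, into 3D points."""
    def __init__(self, code_dim=512, grid_size=45):
        super().__init__()
        # Canonical 2D lattice in [-1, 1]^2, shared across all shapes.
        u = torch.linspace(-1.0, 1.0, grid_size)
        grid = torch.stack(torch.meshgrid(u, u, indexing="ij"), dim=-1)
        self.register_buffer("grid", grid.reshape(-1, 2))  # (M, 2)
        self.fold = nn.Sequential(  # maps (lattice point, codeword) -> 3D point
            nn.Linear(code_dim + 2, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, 3),
        )

    def forward(self, code):                       # code: (B, code_dim)
        B, M = code.shape[0], self.grid.shape[0]
        grid = self.grid.unsqueeze(0).expand(B, M, 2)
        x = torch.cat([grid, code.unsqueeze(1).expand(B, M, -1)], dim=-1)
        return self.fold(x)                        # coarse points: (B, M, 3)

def infer_graph(points, sigma=0.05):
    # Graph-topology inference: a Gaussian-kernel affinity on pairwise
    # distances of the coarsely reconstructed points (an assumed choice).
    d2 = torch.cdist(points, points) ** 2          # (B, M, M)
    A = torch.exp(-d2 / (2 * sigma ** 2))
    return A / A.sum(dim=-1, keepdim=True)         # row-normalized adjacency

def graph_filter(points, A, alpha=0.5):
    # Graph filtering: one low-pass step mixing each point with its
    # neighbors' weighted average, refining the coarse reconstruction.
    return (1 - alpha) * points + alpha * torch.bmm(A, points)

code = torch.randn(4, 512)                         # stand-in encoder output
coarse = FoldingDecoder()(code)                    # coarse reconstruction
refined = graph_filter(coarse, infer_graph(coarse))
print(coarse.shape, refined.shape)                 # (4, 2025, 3) for both
```

In this reading, the folded points serve double duty: they are the coarse output and also the features from which the graph is inferred, so the refinement stage needs no extra supervision on the graph topology.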