We propose an unsupervised technique for implicit parameterization of data manifolds. In our approach, the data is assumed to lie on a lower-dimensional manifold embedded in a higher-dimensional space, and the data points are viewed as the endpoints of trajectories originating outside the manifold. Under this assumption, the data manifold is an attractive manifold of a dynamical system to be estimated. We parameterize such a dynamical system with a residual neural network and propose a spectral localization technique that ensures it is locally attractive in the vicinity of the data. We also present an initialization scheme and additional regularization for the proposed residual layers.
% that we call dissipative bottlenecks.
We discuss the importance of the considered problem for reinforcement learning and support the discussion with examples demonstrating the performance of the proposed layers on denoising and generative tasks.
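
The residual parameterization described above can be sketched in code. The following is a minimal illustration only, not the authors' implementation: it uses off-the-shelf spectral normalization as a stand-in for the proposed spectral localization, and it trains the residual map with a simple denoising objective so that data points become approximately attractive fixed points. The layer sizes, contraction factor \texttt{c}, and noise scale are assumed for the example.

\begin{verbatim}
# Minimal sketch (assumptions noted above): residual blocks x -> x + c * f(x)
# with a spectrally normalized residual branch, trained so that noisy points
# are mapped back toward the data, i.e. F(x + eps) ~= x.
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import spectral_norm


class ContractiveResidualBlock(nn.Module):
    """Residual block x -> x + c * f(x) with spectrally normalized weights."""

    def __init__(self, dim: int, hidden: int = 128, c: float = 0.9):
        super().__init__()
        self.c = c  # scaling factor keeping the residual branch contractive
        self.f = nn.Sequential(
            spectral_norm(nn.Linear(dim, hidden)),  # spectral norm of W <= 1
            nn.Tanh(),                              # 1-Lipschitz nonlinearity
            spectral_norm(nn.Linear(hidden, dim)),  # spectral norm of W <= 1
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.c * self.f(x)


def denoising_loss(model: nn.Module, x: torch.Tensor, noise_std: float = 0.1):
    """Denoising objective: map perturbed points back to the clean data."""
    noisy = x + noise_std * torch.randn_like(x)
    return ((model(noisy) - x) ** 2).mean()


if __name__ == "__main__":
    dim = 2
    model = nn.Sequential(*[ContractiveResidualBlock(dim) for _ in range(4)])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(256, dim)  # stand-in for points on the data manifold
    for _ in range(100):
        opt.zero_grad()
        loss = denoising_loss(model, x)
        loss.backward()
        opt.step()
    # At inference, iterating the learned map pushes points toward the data.
    y = torch.randn(8, dim)
    for _ in range(20):
        y = model(y)
\end{verbatim}

In this sketch, bounding the Lipschitz constant of the residual branch \texttt{c * f} below one keeps each block well conditioned and its Jacobian close to the identity; local attractiveness toward the data is encouraged by the denoising loss rather than enforced by construction, which is where the paper's spectral localization and dissipative regularization would differ.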