Feature extraction is an efficient approach for alleviating the curse of dimensionality in high-dimensional data. As a popular self-supervised learning method, contrastive learning (CL) has recently garnered considerable attention. In this study, we propose a unified framework, based on a new perspective on contrastive learning, that is suitable for both unsupervised and supervised feature extraction. The proposed framework first constructs two CL graphs to uniquely define the positive and negative pairs. The projection matrix is then determined by minimizing the contrastive loss function. Because the framework accounts for both similar and dissimilar samples, it unifies unsupervised and supervised feature extraction. On this basis, we derive three specific methods: an unsupervised contrastive learning method and two supervised contrastive learning methods (supervised CL method 1 and supervised CL method 2). Finally, numerical experiments on five real datasets demonstrate the superior performance of the proposed framework compared with existing methods.
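To make the two-step idea concrete, the sketch below illustrates one plausible instantiation: two affinity graphs (one for positive pairs, one for negative pairs) define graph Laplacians, and a linear projection matrix is obtained by minimizing a trace-difference contrastive objective via a symmetric eigendecomposition. The function name `contrastive_projection`, the weight matrices `W_pos`/`W_neg`, and the trade-off parameter `alpha` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def contrastive_projection(X, W_pos, W_neg, d, alpha=1.0):
    """Illustrative sketch: learn a linear projection from two affinity graphs.

    X      : (features, samples) data matrix
    W_pos  : (n, n) symmetric weights marking positive (similar) pairs
    W_neg  : (n, n) symmetric weights marking negative (dissimilar) pairs
    d      : target dimensionality

    The objective keeps positive pairs close and negative pairs apart:
        minimize tr(P^T X L_pos X^T P) - alpha * tr(P^T X L_neg X^T P),
    solved here as an eigenvalue problem (an assumption, not necessarily
    the paper's exact optimization).
    """
    L_pos = np.diag(W_pos.sum(axis=1)) - W_pos  # Laplacian of the positive graph
    L_neg = np.diag(W_neg.sum(axis=1)) - W_neg  # Laplacian of the negative graph
    M = X @ (L_pos - alpha * L_neg) @ X.T
    M = (M + M.T) / 2                           # symmetrize for numerical safety
    eigvals, eigvecs = np.linalg.eigh(M)        # ascending eigenvalues
    P = eigvecs[:, :d]                          # (features, d) projection matrix
    return P

# Toy usage: project 20-dimensional samples onto 2 dimensions.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 100))
W_pos = (rng.random((100, 100)) > 0.95).astype(float)
W_pos = np.maximum(W_pos, W_pos.T)
W_neg = (rng.random((100, 100)) > 0.95).astype(float)
W_neg = np.maximum(W_neg, W_neg.T)
P = contrastive_projection(X, W_pos, W_neg, d=2)
Z = P.T @ X  # extracted 2-D features
```

In a supervised setting, `W_pos` and `W_neg` could be built from class labels (same-class versus different-class pairs), whereas an unsupervised variant would rely on neighborhood structure; this is how a single graph-based objective can cover both regimes.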