Most existing convolutional dictionary learning (CDL) algorithms are based on batch learning, in which the dictionary filters and the convolutional sparse representations are optimized in an alternating manner over a training dataset. For large training datasets, batch CDL algorithms become prohibitively memory-intensive. Online learning reduces the memory requirements of CDL by updating the dictionary incrementally after computing the sparse representations of each training sample. Nevertheless, learning large dictionaries with existing online CDL (OCDL) algorithms remains computationally expensive. In this paper, we present a novel approximate OCDL method that incorporates a sparse decomposition of the training samples. The resulting optimization problems are solved using the alternating direction method of multipliers (ADMM). Extensive experiments on several image datasets show that the proposed method substantially reduces computational cost while preserving the effectiveness of state-of-the-art OCDL algorithms.
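For context, a standard batch CDL objective (the common convolutional basis pursuit denoising form; the paper's exact formulation may differ in detail) alternates between the filters $\{d_m\}$ and the coefficient maps $\{x_{k,m}\}$ over all training images $s_k$:

$$
\min_{\{d_m\},\{x_{k,m}\}} \; \frac{1}{2} \sum_{k} \Big\| \sum_{m} d_m \ast x_{k,m} - s_k \Big\|_2^2 + \lambda \sum_{k} \sum_{m} \| x_{k,m} \|_1 \quad \text{s.t.} \quad \| d_m \|_2 \le 1 \;\; \forall m ,
$$

where $\ast$ denotes convolution and $\lambda > 0$ controls sparsity. Batch methods solve this jointly over all $k$; online methods process one sample (or a small batch) at a time and update $\{d_m\}$ incrementally, which is what keeps the memory footprint bounded.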
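To illustrate the solver class only, the following is a minimal sketch of a generic ADMM iteration for an l1-regularized least-squares problem with a dense matrix A. It is not the paper's convolutional formulation, where the linear operator is a convolution typically handled in the frequency domain, but it shows the split-and-threshold structure that such solvers share.

import numpy as np

def soft_threshold(v, kappa):
    # Elementwise soft-thresholding: proximal operator of kappa * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=100):
    # Minimize 0.5 * ||A x - b||_2^2 + lam * ||z||_1 subject to x = z.
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    Atb = A.T @ b
    # Factor (A^T A + rho I) once, since rho is kept fixed here.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    for _ in range(n_iter):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u).
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal (soft-thresholding) step.
        z = soft_threshold(x + u, lam / rho)
        # Dual ascent step.
        u += x - z
    return z

In ADMM-based CDL, an analogous splitting is applied to both the convolutional sparse-coding subproblem and the dictionary-update subproblem, with the quadratic step solved efficiently via the FFT rather than a Cholesky factorization.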