Convolutional dictionary learning (CDL), the problem of estimating shift-invariant templates from data, is typically performed without imposing a prior or structure on the templates. In data-scarce or low signal-to-noise ratio (SNR) regimes, the learned templates overfit the data and lack smoothness, which can degrade the predictive performance of downstream tasks. To address this limitation, we propose GPCDL, a convolutional dictionary learning framework that enforces priors on the templates using Gaussian Processes (GPs). Focusing on smoothness, we show theoretically that imposing a GP prior is equivalent to Wiener filtering the learned templates, thereby suppressing high-frequency components and promoting smoothness. We show that the resulting algorithm is a simple extension of the classical iteratively reweighted least squares algorithm and is independent of the choice of GP kernel. This property allows one to experiment flexibly with different smoothness assumptions. Through simulations, we show that GPCDL learns smooth dictionaries more accurately than the unregularized alternative across a range of SNRs. Through an application to neural spiking data, we show that GPCDL learns a more accurate and visually interpretable smooth dictionary, leading to superior predictive performance compared to unregularized CDL as well as parametric alternatives.
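To make the Wiener-filtering interpretation concrete, below is a minimal NumPy sketch, not the paper's implementation: it smooths a noisy template estimate with the GP posterior mean, which for a stationary kernel acts as a Wiener filter that suppresses high-frequency components. The squared-exponential kernel, lengthscale, and noise variance are illustrative assumptions.

```python
import numpy as np

def se_kernel(T, lengthscale=5.0, variance=1.0):
    """Squared-exponential (RBF) GP covariance over T time points (illustrative kernel choice)."""
    t = np.arange(T)
    d2 = (t[:, None] - t[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def wiener_smooth(template, noise_var=0.04, lengthscale=5.0):
    """Smooth a raw template estimate via the GP posterior mean,
    K (K + noise_var * I)^{-1} y, i.e., a Wiener-filter-style smoother."""
    T = len(template)
    K = se_kernel(T, lengthscale)
    return K @ np.linalg.solve(K + noise_var * np.eye(T), template)

# Example: recover a smooth template from a noisy, unregularized estimate.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
clean = np.exp(-((t - 0.5) ** 2) / 0.01)           # ground-truth smooth template
noisy = clean + 0.2 * rng.standard_normal(t.size)  # low-SNR raw estimate
smooth = wiener_smooth(noisy, noise_var=0.04, lengthscale=5.0)
```

In the full GPCDL algorithm this smoothing step is folded into the iteratively reweighted least squares updates; the sketch isolates only the filtering effect of the GP prior on a single template.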