Although different objects possess distinct class-specific features, they usually also share common patterns. This observation has been partially exploited in a recently proposed dictionary learning framework that separates particularity from commonality (COPAR). Inspired by this, we propose a novel method that explicitly and simultaneously learns a set of common patterns as well as class-specific features for classification, under more intuitive constraints. Our dictionary learning framework is therefore characterized by both a shared dictionary and particular (class-specific) dictionaries. On the shared dictionary we enforce a low-rank constraint, i.e., we require that its spanning subspace have low dimension and that the coefficients corresponding to this dictionary be similar. On the particular dictionaries we impose the well-known constraints of Fisher discrimination dictionary learning (FDDL). Furthermore, we develop new fast and accurate algorithms for the subproblems in the learning step, accelerating its convergence. These algorithms can also be applied to FDDL and its extensions. Their efficiency is verified both theoretically and experimentally by comparing their complexity and running time with those of other well-known dictionary learning methods. Experimental results on widely used image datasets establish the advantages of our method over state-of-the-art dictionary learning methods.
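Low-rank constraints of the kind placed on the shared dictionary are commonly enforced through the nuclear norm, whose proximal operator is singular value thresholding. The sketch below illustrates this standard operation; the function name, the threshold value, and the dictionary shape are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def svt(D, tau):
    """Singular value thresholding: the proximal operator of the
    nuclear norm tau * ||D||_*. Shrinking singular values toward
    zero pushes the dictionary toward a low-dimensional subspace."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    s_thresh = np.maximum(s - tau, 0.0)  # soft-threshold singular values
    return U @ np.diag(s_thresh) @ Vt

# Example: a random 64x10 dictionary; thresholding cannot raise its rank.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 10))
D_low_rank = svt(D, tau=5.0)
```

In an alternating optimization scheme, a step like this would typically be interleaved with the sparse-coding and class-specific dictionary updates.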