In this paper, we propose a novel method to learn internal feature representation models that are \textit{compatible} with previously learned ones. Compatible features enable direct comparison of old and newly learned features, allowing them to be used interchangeably over time. This eliminates the need for visual search systems to extract new features for all previously seen images in the gallery-set when sequentially upgrading the representation model. Extracting new features is typically expensive or infeasible in the case of very large gallery-sets and/or real-time systems (e.g., face-recognition systems, social networks, life-long learning systems, robotics, and surveillance systems). Our approach, called Compatible Representations via Stationarity (CoReS), achieves compatibility by encouraging stationarity in the learned representation model without relying on previously learned models. Stationarity ensures that the statistical properties of the features do not change under time shift, so that the currently learned features are interoperable with the old ones. We evaluate single and sequential multi-model upgrading on growing large-scale training datasets and show that our method improves the state of the art in achieving compatible features by a large margin. In particular, upgrading ten times with training data taken from CASIA-WebFace and evaluating on Labeled Faces in the Wild (LFW), we obtain a 49\% increase in the average number of times compatibility is achieved, which is a 544\% relative improvement over the previous state of the art.