Rising concerns about privacy and anonymity preservation of deep learning models have facilitated research in data-free learning (DFL). For the first time, we identify that for data-scarce tasks like Sketch-Based Image Retrieval (SBIR), where the difficulty in acquiring paired photos and hand-drawn sketches limits data-dependent cross-modal learning algorithms, DFL can prove to be a much more practical paradigm. We thus propose Data-Free (DF)-SBIR, where, unlike existing DFL problems, pre-trained, single-modality classification models have to be leveraged to learn a cross-modal metric space for retrieval without access to any training data. The widespread availability of pre-trained classification models, along with the difficulty of acquiring paired photo-sketch datasets for SBIR, justifies the practicality of this setting. We present a methodology for DF-SBIR that can leverage knowledge from models independently trained to perform classification on photos and sketches. We evaluate our approach on the Sketchy, TU-Berlin, and QuickDraw benchmarks, designing a variety of baselines based on state-of-the-art DFL literature, and observe that our method surpasses all of them by significant margins. Our method also achieves mAPs competitive with data-dependent approaches while requiring no training data. Implementation is available at \url{https://github.com/abhrac/data-free-sbir}.