We present Multi-Scale Label Dependence Relation Networks (MSDN), a novel approach to multi-label classification (MLC) that uses 1-dimensional convolution kernels to learn label dependencies at multiple scales. Modern multi-label classifiers have been adopting recurrent neural networks (RNNs) as a memory structure to capture and exploit label dependency relations. RNN-based MLC models, however, tend to introduce a very large number of parameters, which may cause under- or over-fitting. The proposed method uses a 1-dimensional convolutional neural network (1D-CNN) to serve the same purpose in a more efficient manner. By training a model with multiple kernel sizes, the method learns dependency relations among labels at multiple scales while using a drastically smaller number of parameters. On public benchmark datasets, we demonstrate that our model achieves better accuracies with a much smaller number of model parameters than RNN-based MLC models.
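To make the parameter-efficiency claim concrete, the following sketch compares parameter counts for a multi-kernel 1D-CNN against an LSTM operating over a sequence of label embeddings. All hyperparameter values (embedding dimension, kernel sizes, filter and hidden counts) are illustrative assumptions, not values from the paper:

```python
# Illustrative parameter-count comparison: multi-scale 1D-CNN vs. LSTM
# over a sequence of label embeddings of dimension d.
# All hyperparameters below are assumptions for illustration only.

def cnn_params(d, kernel_sizes, filters_per_size):
    # Each 1D conv filter of size k spans k labels across d input
    # channels, plus one bias per filter; one bank per kernel size.
    return sum((k * d + 1) * filters_per_size for k in kernel_sizes)

def lstm_params(d, hidden):
    # Standard LSTM: 4 gates, each with input weights (d x hidden),
    # recurrent weights (hidden x hidden), and a bias vector.
    return 4 * (d * hidden + hidden * hidden + hidden)

d = 64                   # assumed label-embedding dimension
kernels = (2, 3, 4, 5)   # multiple kernel sizes -> multiple dependency scales
cnn = cnn_params(d, kernels, filters_per_size=32)
rnn = lstm_params(d, hidden=256)
print(cnn, rnn)          # the multi-scale CNN uses far fewer parameters
```

Under these assumed settings, the convolutional parameterization is roughly an order of magnitude smaller than the recurrent one, matching the abstract's efficiency argument.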