Multi-task learning (MTL) has emerged as a promising area of machine learning in recent years, aiming to enhance the performance of multiple related learning tasks by exploiting the useful information shared among them. During the training phase, most existing multi-task learning models concentrate entirely on the target-task data and ignore the non-target-task data available alongside the target tasks. To address this issue, Universum data, which do not belong to any class of the classification problem, can be used as prior knowledge in the training model. This study examines multi-task learning with Universum data, exploiting non-target-task data to achieve better performance. It proposes a multi-task twin support vector machine with Universum data (UMTSVM) and provides two approaches to its solution. The first approach considers the dual formulation of UMTSVM and solves it as a quadratic programming problem. The second approach formulates a least-squares version of UMTSVM, referred to as LS-UMTSVM, to further improve generalization performance. Solving the two primal problems in LS-UMTSVM reduces to solving just two systems of linear equations, resulting in a simple and fast method. Numerical experiments on several popular multi-task data sets and medical data sets demonstrate the efficiency of the proposed methods.
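To make the "two systems of linear equations" claim concrete, the following is a minimal sketch of the classical least-squares twin SVM core on which LS-UMTSVM builds: a single task with no Universum term and no multi-task coupling, which the full LS-UMTSVM adds to these same systems. The function names and parameters (`lstsvm_fit`, `c1`, `c2`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0):
    """Fit the two non-parallel hyperplanes of a least-squares twin SVM.

    A: samples of class +1 (m1 x n); B: samples of class -1 (m2 x n).
    Each hyperplane is obtained from one linear system, mirroring the
    structure that LS-UMTSVM exploits (single-task sketch, no Universum).
    """
    m1, m2, n = A.shape[0], B.shape[0], A.shape[1]
    E = np.hstack([A, np.ones((m1, 1))])      # augmented matrix [A  e1]
    F = np.hstack([B, np.ones((m2, 1))])      # augmented matrix [B  e2]
    e1, e2 = np.ones(m1), np.ones(m2)
    reg = 1e-8 * np.eye(n + 1)                # small ridge for numerical stability

    # Hyperplane 1: close to class +1, pushed away from class -1:
    #   [w1; b1] = -(F^T F + (1/c1) E^T E)^{-1} F^T e2
    z1 = -np.linalg.solve(F.T @ F + (E.T @ E) / c1 + reg, F.T @ e2)
    # Hyperplane 2: close to class -1, pushed away from class +1:
    #   [w2; b2] =  (E^T E + (1/c2) F^T F)^{-1} E^T e1
    z2 = np.linalg.solve(E.T @ E + (F.T @ F) / c2 + reg, E.T @ e1)
    return z1[:-1], z1[-1], z2[:-1], z2[-1]   # w1, b1, w2, b2

def lstsvm_predict(X, w1, b1, w2, b2):
    """Label each sample by the nearer of the two hyperplanes."""
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)
```

Each fit requires only two solves with (n+1) x (n+1) matrices, which is what makes the least-squares formulation fast in comparison with the quadratic program arising in the dual approach.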