We propose a novel, fully nonparametric approach to multi-task learning, the Multi-task Highly Adaptive Lasso (MT-HAL). MT-HAL simultaneously learns the features, samples, and task associations important for the common model, while imposing a shared sparse structure among similar tasks. Given multiple tasks, our approach automatically finds a sparse sharing structure. The proposed MTL algorithm attains a powerful dimension-free convergence rate of $o_p(n^{-1/4})$ or better. We show that MT-HAL outperforms sparsity-based MTL competitors across a wide range of simulation studies, including settings with nonlinear and linear relationships, varying levels of sparsity and task correlation, and varying numbers of covariates and sample sizes.