Constructing good representations is critical for learning complex tasks in a sample-efficient manner. In the context of meta-learning, representations can be constructed from the common patterns of previously seen tasks so that a future task can be learned quickly. While recent works have shown the benefit of subspace-based representations, such results are limited to linear-regression tasks. This work explores a more general class of nonlinear tasks with applications ranging from binary classification and generalized linear models to neural networks. We prove that subspace-based representations can be learned in a sample-efficient manner and provably benefit future tasks in terms of sample complexity. Numerical results verify the theoretical predictions on classification and neural-network regression tasks.
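To make the subspace idea concrete, below is a minimal NumPy sketch of one plausible instantiation, not the paper's actual algorithm: the shared subspace is estimated by PCA on per-task moment vectors from the meta-training tasks, and a new binary-classification task is then fit with only r parameters inside that subspace instead of d in the ambient space, which is where the sample-complexity benefit arises. The function names `estimate_subspace` and `fit_new_task` and the synthetic data setup are illustrative assumptions.

```python
import numpy as np

def estimate_subspace(tasks, r):
    """Estimate a shared r-dimensional subspace from meta-training tasks.

    Each task is a pair (X, y) with X of shape (n, d) and labels y in {-1, +1}.
    Illustrative moment-based estimator (an assumption, not the paper's method):
    average the cross-moments y_i * x_i per task, then take the top-r
    eigenvectors of the sum of their outer products.
    """
    d = tasks[0][0].shape[1]
    M = np.zeros((d, d))
    for X, y in tasks:
        b = X.T @ y / len(y)          # per-task moment vector, shape (d,)
        M += np.outer(b, b)
    _, V = np.linalg.eigh(M)          # eigenvalues in ascending order
    return V[:, -r:]                  # top-r eigenvectors span the subspace

def fit_new_task(U, X, y, lr=0.5, steps=500):
    """Fit a logistic model for a new binary task inside the subspace.

    Only r coefficients are learned rather than d, reflecting the
    sample-complexity benefit of the learned representation.
    """
    Z = X @ U                         # project features: shape (n, r)
    y01 = (y + 1) / 2                 # map labels {-1, +1} -> {0, 1}
    w = np.zeros(U.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Z @ w))
        w -= lr * Z.T @ (p - y01) / len(y)
    return U @ w                      # lift back to the ambient dimension

# Toy usage: tasks share a planted 2-D subspace in 20 ambient dimensions.
rng = np.random.default_rng(0)
d, r, T, n = 20, 2, 200, 50
U_star = np.linalg.qr(rng.standard_normal((d, r)))[0]

def make_task():
    theta = U_star @ rng.standard_normal(r)   # task parameter in the subspace
    X = rng.standard_normal((n, d))
    y = np.sign(X @ theta)                    # binary labels from a halfspace
    return X, y

U_hat = estimate_subspace([make_task() for _ in range(T)], r)
theta_new = fit_new_task(U_hat, *make_task())
```

For Gaussian features and labels y = sign(x · theta), the expectation of y * x is proportional to theta, so the per-task moment vectors concentrate around the shared subspace and the PCA step recovers it as the number of tasks grows.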