For deep learning, size is power. Massive neural nets trained on broad data for a spectrum of tasks are at the forefront of artificial intelligence. These foundation models, or 'Jacks of All Trades' (JATs), when fine-tuned for downstream tasks, are gaining importance in driving deep learning advancements. However, environments with tight resource constraints, changing objectives and intentions, or varied task requirements could limit the real-world utility of a singular JAT. Hence, in tandem with current trends towards building increasingly large JATs, this paper conducts an initial exploration into concepts underlying the creation of a diverse collection of compact machine learning model sets. We formulate the Set of Sets, composed of many smaller and specialized models, to simultaneously fulfil many task settings and environmental conditions. A means of arriving at such a set tractably, in a single pass of a neuroevolutionary multitasking algorithm, is presented for the first time, bringing us closer to models that are collectively 'Masters of All Trades'.
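To make the core idea concrete, the following is a minimal, hypothetical sketch of a neuroevolutionary multitasking loop, not the paper's actual algorithm: a single population of compact networks is scored against several toy tasks every generation, and the per-task champions collected at the end constitute one set of specialized models produced in a single evolutionary pass. All names here (`TASKS`, `evolve`, the network structure) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy tasks standing in for distinct task settings / environments:
# each is a different 1-D target function to regress.
TASKS = [np.sin, np.cos, lambda x: np.tanh(2 * x)]

def init_net(hidden):
    """Random single-hidden-layer net with `hidden` units (a compact model)."""
    return {"w1": rng.normal(0, 1, (1, hidden)),
            "b1": rng.normal(0, 1, hidden),
            "w2": rng.normal(0, 1, (hidden, 1))}

def forward(net, x):
    return np.tanh(x @ net["w1"] + net["b1"]) @ net["w2"]

def fitness(net, task, x):
    """Negative mean squared error on one task (higher is better)."""
    return -np.mean((forward(net, x).ravel() - task(x.ravel())) ** 2)

def mutate(net, sigma=0.1):
    """Gaussian perturbation of all weights (the neuroevolutionary operator)."""
    return {k: v + rng.normal(0, sigma, v.shape) for k, v in net.items()}

def evolve(pop_size=40, generations=200, hidden=8):
    """One pass of a toy multitask evolutionary loop: a shared population is
    evaluated on every task each generation, and per-task champions are
    gathered into a single set of specialized compact models at the end."""
    x = np.linspace(-2, 2, 64).reshape(-1, 1)
    pop = [init_net(hidden) for _ in range(pop_size)]
    for _ in range(generations):
        # Score every individual on every task.
        scores = np.array([[fitness(n, t, x) for t in TASKS] for n in pop])
        # Per-task champions survive as elites; offspring mutate from random
        # elites, allowing implicit weight transfer across tasks.
        elites = [pop[i] for i in scores.argmax(axis=0)]
        pop = elites + [mutate(elites[rng.integers(len(elites))])
                        for _ in range(pop_size - len(elites))]
    scores = np.array([[fitness(n, t, x) for t in TASKS] for n in pop])
    return {f"task_{j}": pop[scores[:, j].argmax()] for j in range(len(TASKS))}

model_set = evolve()  # one specialized compact model per task, from one run
```

The design choice mirrored here is that a single population amortizes the search: offspring mutated from one task's elite may turn out to be useful for another task, which is the kind of cross-task transfer that makes a one-pass construction of the model set tractable.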