Multi-Task Learning (MTL) has proven its value in user-facing products through faster training, better data efficiency, and reduced overfitting. MTL achieves this by sharing network parameters and training a single network on multiple tasks simultaneously. However, MTL does not offer a solution when each task must be trained on a different dataset. To address this problem, we propose an architecture named TreeDNN along with its training methodology. TreeDNN enables training the model on multiple datasets simultaneously, where each branch of the tree may require a different training dataset. Our results show that TreeDNN delivers competitive performance while reducing the ROM required for parameter storage and increasing system responsiveness by loading only the specific branch needed at inference time.
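The core idea of a shared trunk with per-task branches, only one of which is loaded at inference, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation; the class and method names, the scalar "layers", and the two-task setup are all assumptions made for clarity.

```python
# Minimal sketch of the TreeDNN idea (all names here are illustrative
# assumptions): a shared trunk feeds several task-specific branches;
# each branch can be trained on its own dataset, and at inference time
# only the trunk plus the one requested branch needs to be in memory.

def make_linear(w, b):
    """Return a toy scalar 'layer': y = w * x + b."""
    return lambda x: w * x + b

class TreeDNN:
    def __init__(self, trunk_layers, branches):
        self.trunk = trunk_layers      # parameters shared by all tasks
        self.branches = branches       # task name -> list of branch layers

    def load_branch(self, task):
        """Mimics loading only one branch from ROM: the effective
        network is the shared trunk plus the selected branch."""
        return self.trunk + self.branches[task]

    def forward(self, task, x):
        for layer in self.load_branch(task):
            x = layer(x)
        return x

# Two tasks sharing one trunk but ending in different branches,
# each of which could be trained on a different dataset.
model = TreeDNN(
    trunk_layers=[make_linear(2.0, 0.0)],
    branches={
        "task_a": [make_linear(1.0, 1.0)],   # e.g. trained on dataset A
        "task_b": [make_linear(-1.0, 0.0)],  # e.g. trained on dataset B
    },
)

print(model.forward("task_a", 3.0))  # trunk: 6.0 -> branch a: 7.0
print(model.forward("task_b", 3.0))  # trunk: 6.0 -> branch b: -6.0
```

The ROM saving follows because the branches never need to be resident at the same time: inference for one task touches only the trunk and that task's branch.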