Transfer learning has emerged as a powerful technique in many application areas, such as computer vision and natural language processing. However, it has been largely overlooked in genetic data analysis. In this paper, we combine transfer learning with a neural-network-based method, expectile neural networks. With transfer learning, instead of starting the learning process from scratch, we initialize the model with what has already been learned on a different but related task. By passing on this previously acquired information, we avoid training from scratch and improve model performance. To demonstrate the performance of the proposed approach, we apply it to two real data sets. With transfer learning, the expectile neural network outperforms the same model trained without transfer learning.
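To make the idea concrete, below is a minimal sketch (not the authors' implementation) of transfer learning with an expectile neural network in PyTorch: a network is first fit on a related source task, its hidden layers are then reused and frozen, and only a new output layer is fit on the target task. The expectile loss form, the network sizes, and the `source_loader` / `target_loader` data loaders are assumptions made purely for illustration.

```python
import copy
import torch
import torch.nn as nn

def expectile_loss(pred, y, tau=0.5):
    # Asymmetric squared loss: residuals above the fit are weighted by tau,
    # residuals below the fit by (1 - tau).
    diff = y - pred
    weight = torch.where(diff >= 0, tau, 1.0 - tau)
    return (weight * diff.pow(2)).mean()

class ExpectileNet(nn.Module):
    def __init__(self, n_features, n_hidden=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Linear(n_features, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
        )
        self.head = nn.Linear(n_hidden, 1)

    def forward(self, x):
        return self.head(self.features(x))

def train(model, loader, tau, epochs=50, lr=1e-3):
    # Optimize only the parameters left trainable (frozen layers are skipped).
    opt = torch.optim.Adam(
        [p for p in model.parameters() if p.requires_grad], lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = expectile_loss(model(x), y, tau)
            loss.backward()
            opt.step()
    return model

# 1) Fit the network on the related (source) task from scratch.
#    `source_loader` and `target_loader` are hypothetical DataLoaders of (x, y) batches.
source_model = train(ExpectileNet(n_features=100), source_loader, tau=0.5)

# 2) Transfer: reuse the learned hidden layers, freeze them, and fit only a new
#    output head on the target task instead of starting from random weights.
target_model = ExpectileNet(n_features=100)
target_model.features = copy.deepcopy(source_model.features)
for p in target_model.features.parameters():
    p.requires_grad = False
target_model = train(target_model, target_loader, tau=0.5)
```

Unfreezing the copied layers with a small learning rate (fine-tuning) is a common alternative when the target data set is large enough.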