Deep learning algorithms differ in how their nodes are connected. They expose numerous hyperparameters that are set either by dedicated algorithms or chosen at random. These hyperparameters can substantially affect the performance of machine learning tasks. In this paper, a tuning guideline is provided for researchers who face problems originating from the hyperparameters of deep learning models. To that end, four types of deep learning algorithms are investigated from a tuning and data-mining perspective, and common hyperparameter search methods are evaluated on all four. According to the results of this study, normalization helps increase classification performance. The number of features did not contribute to a decline in the accuracy of the deep learning algorithms. Although high sparsity leads to low accuracy, a uniform distribution is much more crucial for reaching reliable results in data-mining terms.
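One of the common hyperparameter search methods the abstract refers to is random search. The following is a minimal, self-contained sketch of that idea; the search space, the configuration names, and the `toy_score` objective are all illustrative assumptions, not the paper's actual experimental setup (a real study would train and validate a model in place of `toy_score`).

```python
import random

# Hypothetical search space for a deep network; names are illustrative only.
SPACE = {
    "learning_rate": [1e-1, 1e-2, 1e-3, 1e-4],
    "hidden_units": [32, 64, 128, 256],
    "dropout": [0.0, 0.2, 0.5],
}

def sample_config(rng):
    """Draw one hyperparameter configuration uniformly from the space."""
    return {name: rng.choice(values) for name, values in SPACE.items()}

def toy_score(cfg):
    """Stand-in for validation accuracy; a real study would train a model here."""
    return 1.0 / (1.0 + abs(cfg["learning_rate"] - 1e-2)) - 0.1 * cfg["dropout"]

def random_search(trials=20, seed=0):
    """Evaluate `trials` random configurations and keep the best one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        cfg = sample_config(rng)
        score = toy_score(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

if __name__ == "__main__":
    cfg, score = random_search()
    print(cfg, round(score, 3))
```

Grid search would instead enumerate the full Cartesian product of `SPACE`; random search is often preferred when only a few hyperparameters dominate performance, since it covers each dimension more densely for the same budget.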