We present an approach to designing grid searches for hyper-parameter optimization of recurrent neural architectures. The basis of this approach is the use of mutual information to analyze the long distance dependencies (LDDs) within a dataset. We also report a set of experiments demonstrating that, using this approach, we obtain state-of-the-art results for DilatedRNNs across a range of benchmark datasets.
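As a rough illustration of the kind of analysis the abstract refers to, the sketch below estimates the mutual information between symbols separated by a distance d, so that the decay of I(d) with d characterizes the LDDs in a sequence. This is a minimal illustration only, not the authors' released code; the file name `corpus.txt` and the choice of distances are hypothetical.

```python
# Minimal sketch: mutual information I(X_t; X_{t+d}) between symbols at
# distance d, estimated from co-occurrence counts in a single sequence.
from collections import Counter
from math import log2

def mutual_information_at_distance(seq, d):
    """Estimate I(X_t; X_{t+d}) from symbol pair counts at lag d."""
    pairs = Counter(zip(seq, seq[d:]))   # joint counts of (x_t, x_{t+d})
    left = Counter(seq[:-d])             # marginal counts of x_t
    right = Counter(seq[d:])             # marginal counts of x_{t+d}
    n = sum(pairs.values())              # number of pairs = len(seq) - d
    mi = 0.0
    for (a, b), c in pairs.items():
        p_ab = c / n
        p_a = left[a] / n
        p_b = right[b] / n
        mi += p_ab * log2(p_ab / (p_a * p_b))
    return mi

# Example: print an MI decay curve over increasing distances
# ("corpus.txt" is a hypothetical character-level dataset).
text = open("corpus.txt").read()
for d in (1, 2, 4, 8, 16, 32, 64):
    print(d, mutual_information_at_distance(text, d))
```

How quickly this curve decays toward zero indicates how far apart the dependencies in the dataset reach, which is the property used to inform the hyper-parameter grid search.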