Mixability has been shown to be a powerful tool for obtaining algorithms with optimal regret. However, the resulting methods often suffer from high computational complexity, which has limited their practical applicability. For example, in the case of multiclass logistic regression, the aggregating forecaster of Foster et al. (2018) achieves a regret of $O(\log(Bn))$, whereas Online Newton Step achieves $O(e^B\log(n))$, a doubly exponential gain in $B$ (a bound on the norm of the comparator functions). However, this high statistical performance comes at the price of a prohibitive computational complexity of $O(n^{37})$.