In high dimensions, most machine learning methods become fragile even in the presence of a few outliers. To address this, we introduce a new method built on a base learner, such as Bayesian regression or stochastic gradient descent, to mitigate this vulnerability. Because mini-batch gradient descent converges more robustly than batch gradient descent, we build our method on mini-batch gradient descent and call it Mini-Batch Gradient Descent with Trimming (MBGDT). When applied to our designed datasets, MBGDT achieves state-of-the-art performance and greater robustness than several baselines.
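The trimming idea can be illustrated with a minimal sketch: within each mini-batch, compute per-sample losses, discard the highest-loss samples (suspected outliers), and update only on the rest. This is an illustrative implementation assuming a linear-regression base learner with squared loss; the function name `mbgdt` and the hyperparameters (`trim_frac`, `lr`, `batch_size`, `epochs`) are our own placeholders, not the paper's exact algorithm.

```python
import numpy as np

def mbgdt(X, y, trim_frac=0.2, lr=0.01, batch_size=32, epochs=50, seed=0):
    """Sketch of mini-batch gradient descent with trimming:
    drop the highest-loss samples in each batch before the update.
    (Illustrative only; hyperparameters are assumptions.)"""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    keep = batch_size - int(trim_frac * batch_size)  # samples kept per batch
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            Xb, yb = X[b], y[b]
            resid = Xb @ w - yb
            losses = resid ** 2
            k = max(1, min(keep, len(b)))
            kept = np.argsort(losses)[:k]  # trim suspected outliers
            grad = 2 * Xb[kept].T @ resid[kept] / k
            w -= lr * grad
    return w
```

With contaminated regression data, the trimmed update tends to recover the clean coefficients where a plain least-squares fit would be pulled toward the outliers.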