Classification, a supervised learning task, is an important topic in machine learning. It aims at categorizing a set of data into classes. Several classification methods are in common use today, such as k-nearest neighbors, random forests, and support vector machines. Each has its own strengths and weaknesses, and none dominates across all kinds of problems. In this thesis, we focus on Quadratic Multiform Separation (QMS), a classification method recently proposed by Michael Fan et al. (2019). Its fresh concept, rich mathematical structure, and innovative definition of the loss function set it apart from existing classification methods. Inspired by QMS, we propose using a gradient-based optimization method, Adam, to obtain a classifier that minimizes the QMS-specific loss function. In addition, we provide suggestions for model tuning by exploring the relationships between hyperparameters and accuracy. Our empirical results show that QMS performs as well as most classification methods in terms of accuracy, and that its performance is nearly comparable to that of the gradient boosting algorithms that dominate machine learning competitions.
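The abstract's proposal, minimizing a custom loss with Adam, can be illustrated with a minimal sketch. The QMS loss itself is not defined here, so the code below substitutes a placeholder quadratic loss; the helper `adam_minimize` and its gradient argument are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def adam_minimize(grad_fn, w0, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8, steps=1000):
    """Generic Adam loop: grad_fn(w) must return the loss gradient at w.
    In the thesis's setting, grad_fn would be the gradient of the
    QMS-specific loss with respect to the classifier's parameters."""
    w = w0.astype(float).copy()
    m = np.zeros_like(w)  # first-moment (mean) estimate of the gradient
    v = np.zeros_like(w)  # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)      # bias correction for m
        v_hat = v / (1 - beta2 ** t)      # bias correction for v
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

# Placeholder stand-in for the QMS loss: a quadratic with minimum at [3, -2].
target = np.array([3.0, -2.0])
grad = lambda w: 2.0 * (w - target)
w_opt = adam_minimize(grad, np.zeros(2), lr=0.05, steps=5000)
```

Because Adam only needs gradients of the loss, the same loop applies unchanged once the QMS loss and its gradient are plugged in for the placeholder.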