We investigate approximation guarantees provided by logistic regression for the fundamental problem of agnostic learning of homogeneous halfspaces. Previously, for a certain broad class of "well-behaved" distributions on the examples, Diakonikolas et al. (2020) proved an $\tilde{\Omega}(\textrm{OPT})$ lower bound, while Frei et al. (2021) proved an $\tilde{O}(\sqrt{\textrm{OPT}})$ upper bound, where $\textrm{OPT}$ denotes the best zero-one/misclassification risk of a homogeneous halfspace. In this paper, we close this gap by constructing a well-behaved distribution such that the global minimizer of the logistic risk over this distribution achieves only $\Omega(\sqrt{\textrm{OPT}})$ misclassification risk, matching the upper bound of Frei et al. (2021). On the other hand, we also show that if we impose a radial-Lipschitzness condition on the distribution in addition to well-behavedness, then logistic regression constrained to a ball of bounded radius achieves $\tilde{O}(\textrm{OPT})$ misclassification risk. Our techniques also show that for any well-behaved distribution, regardless of radial Lipschitzness, we can overcome the $\Omega(\sqrt{\textrm{OPT}})$ lower bound for the logistic loss at the cost of just one additional convex optimization step involving the hinge loss, attaining $\tilde{O}(\textrm{OPT})$ misclassification risk. This two-step convex optimization algorithm is simpler than previous methods obtaining this guarantee, all of which require solving $O(\log(1/\textrm{OPT}))$ minimization problems.
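For concreteness, the following is a minimal numerical sketch of a two-step procedure of this flavor: an unconstrained logistic-regression step followed by a single convex hinge-loss minimization. The warm start at the normalized logistic solution, the fixed step sizes, and the final normalization are illustrative assumptions for this sketch, not the exact construction analyzed in the paper.

```python
import numpy as np

def logistic_grad(w, X, y):
    # Gradient of the empirical logistic risk: mean of log(1 + exp(-y <w, x>)).
    margins = y * (X @ w)
    coeff = -y / (1.0 + np.exp(margins))
    return (X * coeff[:, None]).mean(axis=0)

def hinge_subgrad(w, X, y):
    # Subgradient of the empirical hinge risk: mean of max(0, 1 - y <w, x>).
    margins = y * (X @ w)
    active = (margins < 1.0).astype(float)
    return (X * (-y * active)[:, None]).mean(axis=0)

def minimize_convex(grad_fn, X, y, w0, lr=0.5, steps=2000):
    # Plain (sub)gradient descent; both objectives above are convex in w.
    w = w0.copy()
    for _ in range(steps):
        w -= lr * grad_fn(w, X, y)
    return w

def two_step_halfspace(X, y):
    # Step 1: logistic regression from the origin.
    d = X.shape[1]
    w_log = minimize_convex(logistic_grad, X, y, np.zeros(d))
    # Step 2: one additional convex optimization with the hinge loss,
    # warm-started at the normalized logistic solution (illustrative choice).
    w0 = w_log / max(np.linalg.norm(w_log), 1e-12)
    w_hinge = minimize_convex(hinge_subgrad, X, y, w0)
    # Return a unit vector defining the learned homogeneous halfspace.
    return w_hinge / max(np.linalg.norm(w_hinge), 1e-12)

if __name__ == "__main__":
    # Toy agnostic data: a ground-truth halfspace with a small fraction of flipped labels.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((2000, 5))
    w_star = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
    y = np.sign(X @ w_star)
    flip = rng.random(2000) < 0.05          # OPT = 0.05 noise level
    y[flip] *= -1
    w_hat = two_step_halfspace(X, y)
    err = np.mean(np.sign(X @ w_hat) != y)  # zero-one risk on the training sample
    print(f"misclassification risk: {err:.3f}")
```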