This paper is concerned with computationally efficient learning of homogeneous sparse halfspaces in $\mathbb{R}^d$ under noise. Though recent works have established attribute-efficient learning algorithms under various types of label noise (e.g. bounded noise), it remains an open question when and how $s$-sparse halfspaces can be efficiently learned under the challenging malicious noise model, where an adversary may corrupt both the unlabeled examples and the labels. We answer this question in the affirmative by designing a computationally efficient active learning algorithm with near-optimal label complexity of $\tilde{O}\big(s \log^4 \frac d \epsilon \big)$ and noise tolerance $\eta = \Omega(\epsilon)$, where $\epsilon \in (0, 1)$ is the target error rate, under the assumption that the distribution over (uncorrupted) unlabeled examples is isotropic log-concave. Our algorithm can be straightforwardly tailored to the passive learning setting, where we show a sample complexity of $\tilde{O}\big(\frac 1 \epsilon s^2 \log^5 d \big)$, which also enjoys attribute efficiency. Our main techniques include attribute-efficient paradigms for instance reweighting and for empirical risk minimization, and a new analysis of uniform concentration for unbounded data -- all of which crucially take the structure of the underlying halfspace into account.