A common challenge in large-scale supervised learning is how to exploit new incremental data to update a pre-trained model without re-training it from scratch. Motivated by this problem, we revisit the canonical problem of dynamic least-squares regression (LSR), where the goal is to learn a linear model over incremental training data. In this setup, data and labels $(\mathbf{A}^{(t)}, \mathbf{b}^{(t)}) \in \mathbb{R}^{t \times d}\times \mathbb{R}^t$ evolve in an online fashion ($t\gg d$), and the goal is to efficiently maintain an (approximate) solution to $\min_{\mathbf{x}^{(t)}} \| \mathbf{A}^{(t)} \mathbf{x}^{(t)} - \mathbf{b}^{(t)} \|_2$ for all $t\in [T]$. Our main result is a dynamic data structure that maintains an arbitrarily small constant-approximate solution to dynamic LSR with amortized update time $O(d^{1+o(1)})$, almost matching the running time of the static (sketching-based) solution. By contrast, for exact (or even $1/\mathrm{poly}(n)$-accurate) solutions, we show a separation between the static and dynamic settings: dynamic LSR requires $\Omega(d^{2-o(1)})$ amortized update time under the OMv Conjecture (Henzinger et al., STOC'15). Our data structure is conceptually simple, easy to implement, and fast both in theory and practice, as corroborated by experiments on both synthetic and real-world datasets.
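To make the problem setup concrete, the following is a minimal sketch of the standard *exact* dynamic baseline, namely recursive least squares with Sherman–Morrison rank-one updates, which spends $O(d^2)$ time per inserted row (consistent with the exact-maintenance lower bound above). It is not the paper's approximate $O(d^{1+o(1)})$ data structure; the class name `ExactDynamicLSR` and the `ridge` regularizer are illustrative choices, not from the paper.

```python
import numpy as np

class ExactDynamicLSR:
    """Exact dynamic least-squares regression via recursive least squares.

    Maintains x^{(t)} = argmin_x ||A^{(t)} x - b^{(t)}||_2 (up to a small
    ridge term) under row insertions, using Sherman-Morrison rank-one
    updates of (A^T A)^{-1}. Each insertion costs O(d^2) time.
    """

    def __init__(self, d, ridge=1e-8):
        # Small ridge term keeps the inverse well defined before A^T A
        # becomes full rank; it vanishes in effect once t >> d.
        self.P = np.eye(d) / ridge      # running (A^T A + ridge*I)^{-1}
        self.x = np.zeros(d)            # current solution estimate

    def insert_row(self, a, b):
        """Append row a with label b and refresh the solution in O(d^2)."""
        Pa = self.P @ a
        k = Pa / (1.0 + a @ Pa)         # gain vector from Sherman-Morrison
        self.x += k * (b - a @ self.x)  # correct by the prediction residual
        self.P -= np.outer(k, Pa)       # rank-one update of the inverse

    def solution(self):
        return self.x


# Illustrative usage on synthetic streaming data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, T = 5, 1000
    x_true = rng.normal(size=d)
    model = ExactDynamicLSR(d)
    for _ in range(T):
        a = rng.normal(size=d)
        model.insert_row(a, a @ x_true + 0.01 * rng.normal())
    print(model.solution())  # close to x_true
```

The quadratic per-update cost of this exact scheme is precisely the regime the OMv-based lower bound addresses; the paper's contribution is showing that relaxing to a constant-factor approximation breaks this barrier.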