Identification of a linear time-invariant dynamical system from partial observations is a fundamental problem in control theory. Particularly challenging are systems exhibiting long-term memory. A natural question is how to learn such systems with non-asymptotic statistical rates depending on the inherent dimensionality (order) $d$ of the system, rather than on the possibly much larger memory length. We propose an algorithm that, given a single trajectory of length $T$ with Gaussian observation noise, learns the system with a near-optimal rate of $\widetilde O\left(\sqrt{\frac{d}{T}}\right)$ in $\mathcal{H}_2$ error, with only logarithmic, rather than polynomial, dependence on memory length. We also give bounds under process noise and improved bounds for learning a realization of the system. Our algorithm is based on multi-scale low-rank approximation: SVD applied to Hankel matrices of geometrically increasing sizes. Our analysis relies on careful application of concentration bounds in the Fourier domain -- we give sharper concentration bounds for the sample covariance of correlated inputs and for $\mathcal{H}_\infty$ norm estimation, which may be of independent interest.
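The core primitive of the multi-scale approach -- rank-$d$ SVD truncation of Hankel matrices built from Markov parameters, at geometrically increasing sizes -- can be sketched as follows. This is a minimal illustration of the primitive only, not the paper's full algorithm; the Markov parameters `h` are taken from a hypothetical order-1 system with a slowly decaying (long-memory) impulse response, chosen purely for demonstration.

```python
import numpy as np

def hankel_svd_truncate(h, m, d):
    """Rank-d SVD truncation of the m x m Hankel matrix whose
    (i, j) entry is the Markov parameter h[i + j + 1]."""
    H = np.array([[h[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H)
    # Keep only the top-d singular directions (low-rank approximation).
    return (U[:, :d] * s[:d]) @ Vt[:d]

# Hypothetical example: Markov parameters of a scalar system of
# order d = 1 whose memory decays slowly, h[t] = 0.99^t.
T = 256
h = 0.99 ** np.arange(T)

# Multi-scale step: apply the SVD truncation at geometrically
# increasing Hankel sizes m = 2, 4, 8, 16.
for m in [2, 4, 8, 16]:
    H_d = hankel_svd_truncate(h, m, d=1)
```

Since the example's true order is 1, the rank-1 truncation reproduces each Hankel matrix exactly; with noisy Markov-parameter estimates, the truncation instead acts as a denoising step at every scale.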