Self-normalized processes arise naturally in many statistical tasks. While self-normalized concentration has been extensively studied for scalar-valued processes, there is less work on multidimensional processes outside of the sub-Gaussian setting. In this work, we construct a general, self-normalized inequality for $\mathbb{R}^d$-valued processes that satisfy a simple yet broad "sub-$\psi$" tail condition, which generalizes assumptions based on cumulant generating functions. From this general inequality, we derive an upper law of the iterated logarithm for sub-$\psi$ vector-valued processes, which is tight up to small constants. We demonstrate applications in prototypical statistical tasks, such as parameter estimation in online linear regression and autoregressive modeling, and bounded mean estimation via a new (multivariate) empirical Bernstein concentration inequality.