Constants-specified and exponential concentration inequalities play an essential role in the finite-sample theory of machine learning and high-dimensional statistics. We obtain sharper, constants-specified concentration inequalities for sums of independent sub-Weibull random variables, which exhibit a mixture of two tails: sub-Gaussian for small deviations and sub-Weibull for large deviations from the mean. These bounds are new and improve on existing bounds with sharper constants. In addition, a new sub-Weibull parameter is proposed, which enables recovery of tight concentration inequalities for a random variable (vector). For statistical applications, we derive an $\ell_2$-error bound for the estimated coefficients in negative binomial regressions when the heavy-tailed covariates are sub-Weibull distributed with a sparse structure, which is a new result for negative binomial regressions. For applications to random matrices, we derive non-asymptotic versions of Bai-Yin's theorem for sub-Weibull entries with exponential tail bounds. Finally, by demonstrating a sub-Weibull confidence region for a log-truncated Z-estimator without a second-moment condition, we discuss and define a sub-Weibull-type robust estimator for independent observations $\{X_i\}_{i=1}^{n}$ without exponential-moment conditions.
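Schematically, a mixed-tail concentration bound of the kind described above takes the following generic form (a sketch only; the quantities $C$, $v$, and $b$ stand in for the norm-dependent constants of the actual theorems, and $\theta$ denotes the sub-Weibull index):

```latex
\mathbb{P}\!\left(\Big|\sum_{i=1}^{n}\big(X_i-\mathbb{E}X_i\big)\Big|\ge t\right)
\le 2\exp\!\left(-C\min\!\left\{\frac{t^{2}}{v^{2}},\;\Big(\frac{t}{b}\Big)^{1/\theta}\right\}\right).
```

For small $t$ the quadratic term $t^{2}/v^{2}$ attains the minimum, giving a sub-Gaussian tail; for large $t$ the term $(t/b)^{1/\theta}$ dominates, giving the heavier sub-Weibull tail.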