When analyzing modern machine learning algorithms, we may need to handle kernel density estimation (KDE) with intricate kernels that are not designed by the user and may even be irregular and asymmetric. To address this emerging challenge, we provide a strong uniform consistency result with an $L^\infty$ convergence rate for KDE on Riemannian manifolds with kernels that are Riemann integrable (in the ambient Euclidean space). We also provide an $L^1$ consistency result for KDE on Riemannian manifolds with Lebesgue integrable kernels. The isotropic kernels considered in this paper differ from the kernels in the Vapnik-Chervonenkis class that are frequently considered in the statistics community; we illustrate the difference when applying them to estimate the probability density function. Moreover, we elaborate on the subtle difference between kernels designed on the intrinsic manifold and on the ambient Euclidean space, both of which may be encountered in practice. Finally, we prove a necessary and sufficient condition for an isotropic kernel to be Riemann integrable on a submanifold of Euclidean space.
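To make the setting concrete, the following minimal sketch (not the paper's method; the annulus-shaped kernel and all parameter values are illustrative assumptions) shows a one-dimensional Euclidean KDE with an isotropic kernel that is Riemann integrable but discontinuous, i.e. the kind of irregular kernel, not designed by the user, that the results above cover:

```python
import numpy as np

def kde(x_eval, samples, kernel, h):
    """Kernel density estimate f_hat(x) = (1/(n h)) * sum_i K((x - X_i) / h)."""
    # Pairwise scaled differences, shape (len(x_eval), len(samples))
    u = (x_eval[:, None] - samples[None, :]) / h
    return np.mean(kernel(u), axis=1) / h

def annulus_kernel(u):
    """An isotropic, discontinuous kernel: the indicator of 0.25 <= |u| <= 1.
    It is Riemann integrable but not continuous; the constant 1.5 is the
    Lebesgue measure of the support, so the kernel integrates to 1."""
    r = np.abs(u)
    return ((r >= 0.25) & (r <= 1.0)) / 1.5

rng = np.random.default_rng(0)
samples = rng.normal(size=2000)          # data from a standard normal
xs = np.linspace(-3.0, 3.0, 61)
fhat = kde(xs, samples, annulus_kernel, h=0.4)
```

Even with this irregular kernel, the estimate is a nonnegative function whose Riemann sum over the evaluation grid is close to 1, and it peaks near the mode of the underlying density.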