This is a tutorial and survey paper on various methods for Sufficient Dimension Reduction (SDR). We cover these methods from both the perspective of statistical high-dimensional regression and the machine learning approach to dimensionality reduction. We begin by introducing inverse regression methods, including Sliced Inverse Regression (SIR), Sliced Average Variance Estimation (SAVE), contour regression, directional regression, Principal Fitted Components (PFC), Likelihood Acquired Directions (LAD), and graphical regression. Then, we introduce forward regression methods, including Principal Hessian Directions (pHd), Minimum Average Variance Estimation (MAVE), Conditional Variance Estimation (CVE), and deep SDR methods. Finally, we explain Kernel Dimension Reduction (KDR) for both supervised and unsupervised learning. We also show that supervised KDR and supervised PCA are equivalent.
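To give a concrete flavor of the inverse regression family surveyed here, the following is a minimal NumPy sketch of SIR under the standard setup (slice the response, average the standardized predictors within each slice, and eigen-decompose the between-slice covariance of those means). The function name `sir` and its parameters are illustrative, not taken from the paper.

```python
import numpy as np

def sir(X, y, n_slices=10, n_directions=2):
    """Minimal Sliced Inverse Regression (SIR) sketch.

    Estimates a basis for the sufficient directions by slicing y,
    averaging the standardized predictors within each slice, and
    eigen-decomposing the weighted covariance of the slice means.
    """
    n, p = X.shape

    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt

    # Partition observations into slices by the sorted response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of slice means: M = sum_h p_h m_h m_h^T
    M = np.zeros((p, p))
    for idx in slices:
        p_h = len(idx) / n
        m_h = Z[idx].mean(axis=0)
        M += p_h * np.outer(m_h, m_h)

    # Top eigenvectors of M span the standardized directions;
    # map back to the original scale with Sigma^{-1/2}
    _, evecs_M = np.linalg.eigh(M)
    top = evecs_M[:, ::-1][:, :n_directions]
    return inv_sqrt @ top  # columns estimate sufficient directions


# Toy usage: y depends on X only through one direction b
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))
b = np.array([1.0, -1.0, 0, 0, 0, 0]) / np.sqrt(2)
y = (X @ b) ** 3 + 0.1 * rng.normal(size=2000)
B = sir(X, y, n_slices=20, n_directions=1)  # B[:, 0] should align with b
```

Note that SIR relies on a monotone-in-mean link, which is why the toy response uses a cubic rather than a symmetric function; for symmetric links, second-moment methods such as SAVE, also covered in this survey, are needed.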