We consider the regression problem where the dependence of the response Y on a set of predictors X is fully captured by the regression function E(Y | X) = g(B'X), for an unknown function g and a low-rank parameter matrix B. We combine neural networks with sufficient dimension reduction in order to remove the latter's restriction to problems with small p and n. We show in simulations that the proposed estimator is on par with competing sufficient dimension reduction methods, such as minimum average variance estimation and conditional variance estimation, in small p and n settings. Among these, it is the only one that remains computationally feasible in large p and n problems.
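To illustrate the idea of estimating the model E(Y | X) = g(B'X) with a neural network, the following is a minimal sketch, not the paper's exact estimator: a feed-forward network whose first layer is a bias-free linear map to a d-dimensional bottleneck, so that its weight matrix plays the role of B while the remaining layers approximate the link g. The class name SDRNet, the architecture, and the simulated single-index data are illustrative assumptions.

import torch
import torch.nn as nn

class SDRNet(nn.Module):
    def __init__(self, p, d, hidden=64):
        super().__init__()
        # Linear reduction X -> B'X (no bias); d << p is the assumed structural dimension.
        self.B = nn.Linear(p, d, bias=False)
        # Unknown link g approximated by a small MLP.
        self.g = nn.Sequential(nn.Linear(d, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x):
        return self.g(self.B(x))

if __name__ == "__main__":
    # Simulated single-index example: Y = sin(beta'X) + noise, true dimension d = 1.
    torch.manual_seed(0)
    n, p, d = 2000, 20, 1
    X = torch.randn(n, p)
    beta = torch.zeros(p)
    beta[0] = 1.0
    Y = torch.sin(X @ beta).unsqueeze(1) + 0.1 * torch.randn(n, 1)

    # Fit by least squares; the row space of model.B.weight then estimates
    # the sufficient dimension reduction subspace spanned by the columns of B.
    model = SDRNet(p, d)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()
    for _ in range(500):
        opt.zero_grad()
        loss = loss_fn(model(X), Y)
        loss.backward()
        opt.step()

Because the reduction is learned by stochastic gradient descent rather than by the eigen-decompositions or local smoothing used in classical sufficient dimension reduction, a sketch of this kind scales to large p and n, which is the motivation stated above.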