The IBOSS approach proposed by Wang et al. (2019) selects the most informative subset of the n data points. It assumes that ordinary least squares is used for model fitting and requires that the number of variables, p, not be large. In many practical problems, however, p is very large and penalty-based model-fitting methods such as the LASSO are used. We study big data problems in which both n and p are large. In the first part, we focus on reducing the number of data points. We develop theoretical results showing that the IBOSS type of approach is applicable to penalty-based regressions such as the LASSO. In the second part, we consider situations where p is extremely large. We propose a two-step approach that first reduces the number of variables and then reduces the number of data points. Two separate algorithms are developed, and their performance is studied through extensive simulation studies. Compared with existing methods, including the well-known split-and-conquer approach, the proposed methods enjoy advantages in estimation accuracy, prediction accuracy, and computation time.
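To make the two-step idea concrete, the following is a minimal sketch, not the authors' algorithm: it uses a simple correlation-based screen to cut p (a sure-independence-screening-style choice assumed here for illustration), a simplified IBOSS-style rule that keeps rows with extreme values in each retained column to cut n, and a plain cyclic coordinate-descent LASSO fit on the reduced data. All function names and tuning values (d, k, lam) are hypothetical.

```python
import numpy as np

def screen_variables(X, y, d):
    """Step 1 (cut p): keep the d columns most correlated with y in absolute value."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = np.abs(Xc.T @ yc) / (X.std(axis=0) * y.std() * len(y))
    return np.sort(np.argsort(corr)[-d:])

def iboss_subsample(X, k):
    """Step 2 (cut n): for each column, keep rows with the r largest and r
    smallest values (r = k // (2p)). A simplified stand-in for IBOSS, which
    processes columns sequentially and excludes already-selected rows."""
    n, p = X.shape
    r = max(1, k // (2 * p))
    chosen = set()
    for j in range(p):
        order = np.argsort(X[:, j])
        chosen.update(order[:r].tolist())
        chosen.update(order[-r:].tolist())
    return np.fromiter(chosen, dtype=int)

def lasso_cd(X, y, lam, n_iter=200):
    """Plain cyclic coordinate descent for the LASSO on the reduced data."""
    n, p = X.shape
    beta = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            z = X[:, j] @ r
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_norm2[j]
    return beta

# Synthetic example: large n and p, sparse true signal.
rng = np.random.default_rng(0)
n, p = 2000, 100
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.1 * rng.standard_normal(n)

keep = screen_variables(X, y, d=10)          # reduce variables first
rows = iboss_subsample(X[:, keep], k=200)    # then reduce data points
beta_hat = lasso_cd(X[np.ix_(rows, keep)], y[rows], lam=5.0)
```

The order of the two steps matters: screening first makes the subsequent row selection operate on only d columns, so the subsampling cost no longer scales with the original p.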