In a multiple regression model whose features (predictors) form an orthonormal basis, we prove that there exists a uniformly most powerful unbiased (UMPU) test of the null hypothesis that the coefficient of a single feature is negative or zero against the alternative that it is positive. The test statistic is the same as the coefficient t-test commonly reported by standard statistical software, and it has the same null distribution. This result suggests that orthogonalizing the other features around the predictor of interest, prior to fitting the multiple regression, might increase power in single-coefficient testing compared with regressing on the raw, correlated features. This paper determines the conditions under which this is true, and argues that those conditions are fulfilled in a majority of applications.
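As a minimal numerical sketch of the comparison described above (not from the paper; the simulation setup, variable names, and coefficient values are illustrative assumptions), the following Python code fits the same response once on raw, correlated features and once after orthogonalizing the nuisance feature around the predictor of interest, and reports the single-coefficient t-statistic in each case.

```python
import numpy as np

def coef_t_stats(X, y):
    """OLS fit; return coefficient estimates and their t-statistics."""
    n, p = X.shape
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - p)           # unbiased estimate of the noise variance
    cov = sigma2 * np.linalg.inv(X.T @ X)      # covariance matrix of the OLS estimates
    return beta, beta / np.sqrt(np.diag(cov))

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)                        # predictor of interest (assumed setup)
x2 = 0.7 * x1 + rng.normal(size=n)             # correlated nuisance predictor
y = 0.3 * x1 + 0.5 * x2 + rng.normal(size=n)   # illustrative true coefficients

X_raw = np.column_stack([x1, x2])

# Orthogonalize the nuisance predictor around the predictor of interest:
# subtract the projection of x2 onto x1 so the two columns are orthogonal.
x2_orth = x2 - (x1 @ x2 / (x1 @ x1)) * x1
X_orth = np.column_stack([x1, x2_orth])

b_raw, t_raw = coef_t_stats(X_raw, y)
b_orth, t_orth = coef_t_stats(X_orth, y)
print(f"raw design:        beta1 = {b_raw[0]:.3f}, t1 = {t_raw[0]:.2f}")
print(f"orthogonalized:    beta1 = {b_orth[0]:.3f}, t1 = {t_orth[0]:.2f}")
```

Note that orthogonalizing around the predictor of interest changes what the first coefficient estimates (and hence its t-statistic), which is precisely why the conditions under which this increases power require the analysis carried out in the paper.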