We show that the one-way ANOVA and Tukey-Kramer (TK) tests agree on any sample with two groups. This result is based on a simple identity connecting the Fisher-Snedecor and studentized range probability distributions and is proven without any additional assumptions; in particular, the standard ANOVA assumptions of independence, normality, and homoscedasticity (INAH) are not needed. In contrast, it is known that for a sample with $k > 2$ groups of observations, even under the INAH assumptions and with the same significance level $\alpha$, the two tests may give opposite results: (i) ANOVA rejects its null hypothesis $H_0^{A}: \mu_1 = \ldots = \mu_k$, while the TK test rejects $H_0^{TK}(i,j): \mu_i = \mu_j$ for no pair $i, j \in \{1, \ldots, k\}$; (ii) the TK test rejects $H_0^{TK}(i,j)$ for some pair $(i, j)$ with $i \neq j$, while ANOVA does not reject $H_0^{A}$. We construct two infinite pseudo-random families of samples satisfying INAH, one of each type: of type (i) for any $k \geq 3$ and of type (ii) for some larger $k$. Furthermore, in case (ii) ANOVA, restricted to the pair of groups $(i, j)$, may reject the equality $\mu_i = \mu_j$ at the same level $\alpha$. This is an obvious contradiction, since $\mu_1 = \ldots = \mu_k$ implies $\mu_i = \mu_j$ for all $i, j \in \{1, \ldots, k\}$. Similar contradictory examples are constructed for Multivariable Linear Regression (MLR). However, for these constructions it seems difficult to verify the Gauss-Markov assumptions, which are standard requirements for MLR.
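To make the two-group case concrete, the following is a minimal sketch of the standard identity underlying it (our notation; the paper's exact formulation may differ). For two groups of sizes $n_1, n_2$ with means $\bar{y}_1, \bar{y}_2$ and pooled within-group mean square $\mathrm{MSE}$ on $\nu = n_1 + n_2 - 2$ degrees of freedom, the ANOVA and Tukey-Kramer statistics are
$$ F \;=\; \frac{(\bar{y}_1 - \bar{y}_2)^2}{\mathrm{MSE}\,\bigl(\tfrac{1}{n_1} + \tfrac{1}{n_2}\bigr)}, \qquad q \;=\; \frac{|\bar{y}_1 - \bar{y}_2|}{\sqrt{\tfrac{\mathrm{MSE}}{2}\,\bigl(\tfrac{1}{n_1} + \tfrac{1}{n_2}\bigr)}}, \qquad \text{so} \quad q^2 = 2F $$
for every sample, by pure algebra and without any distributional assumptions. On the distribution side, the studentized range variable with two groups satisfies $Q_{2,\nu} \stackrel{d}{=} \sqrt{2}\,|T_\nu|$ while $F_{1,\nu} \stackrel{d}{=} T_\nu^2$, so the critical values obey $q_{2,\nu;\alpha}^2 = 2 F_{1,\nu;\alpha}$; consequently the two tests accept or reject together at any level $\alpha$ when $k = 2$.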