Aldrich-McKelvey scaling is a method for correcting differential item functioning in ordered rating scales of perceived ideological positions in surveys. In this collection of notes, I present four findings. First, I show that, as with ordinary least squares, Aldrich-McKelvey scaling can be improved by using a QR decomposition in the estimation stage. While in theory this might improve numerical accuracy, in practice the main advantage is retaining respondents who would otherwise be lost in the estimation stage. Second, I show that this approach leads to a proof of an identification constraint of Aldrich-McKelvey scaling: a minimum of three external stimuli is required. Third, I show that a common motivation for Aldrich-McKelvey scaling, namely that it is more robust to heteroskedasticity than simply taking the mean of the raw placements, does not hold up. A review of the prediction-aggregation literature shows that taking the mean is equally robust. However, Aldrich-McKelvey scaling remains robust to rationalization bias and is transparent in its assumptions. Finally, I show that Bayesian Aldrich-McKelvey scaling and Aldrich-McKelvey scaling differ in their parameterisation. This is not commonly acknowledged in the literature, and new users of these methods should be aware of the difference.
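To make the first finding concrete, the following is a minimal sketch, in Python with NumPy, of a respondent-level least-squares step solved via a QR decomposition rather than the normal equations. It is an illustration under my own assumptions, not the implementation described in these notes: the function name, the assumption that stimulus positions are already available from a previous estimation step, and the handling of missing placements are all hypothetical choices made for the example.

```python
import numpy as np

def respondent_transform_qr(stimuli, placements):
    """Estimate a respondent's intercept and slope by least squares, solved
    via a QR decomposition of the design matrix instead of forming X'X.

    stimuli:    (J,) array of stimulus positions on the common scale
                (e.g. from a previous estimation step)
    placements: (J,) array of the respondent's raw placements of those
                stimuli, with np.nan marking stimuli left unplaced

    Returns (intercept, slope), or None if the parameters are not identified
    for this respondent.
    """
    mask = ~np.isnan(placements)
    if mask.sum() < 2:
        return None  # fewer than two placements: intercept and slope not identified

    # Design matrix: a column of ones for the intercept plus the placements.
    X = np.column_stack([np.ones(mask.sum()), placements[mask]])
    y = stimuli[mask]

    # Thin QR factorisation X = QR, then solve R b = Q'y.
    # This avoids forming the cross-product matrix X'X, which is poorly
    # conditioned or singular when a respondent's placements are (nearly)
    # constant, and makes the rank check explicit.
    Q, R = np.linalg.qr(X)
    if abs(R[1, 1]) < 1e-10:
        return None  # rank-deficient design: slope not identified
    b = np.linalg.solve(R, Q.T @ y)
    return b[0], b[1]
```

The design choice the sketch is meant to highlight is simply that the rank check happens on R, so a respondent is only excluded when the regression is genuinely unidentified, rather than failing silently when a naive inversion of X'X breaks down numerically.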