For a multinomial distribution, suppose that we have prior knowledge of the sum of the probabilities of some categories. This knowledge allows us to construct a submodel of the full (i.e., unrestricted) model. Maximum likelihood estimation (MLE) under this submodel is expected to be more efficient than MLE under the full model. This article presents the asymptotic expansion of the risk of the MLE with respect to Kullback--Leibler divergence for both the full model and the submodel. The results reveal that the risk reduction achieved by the submodel is quite small in some cases. Furthermore, when the sample size is small, using the submodel can even increase the risk.
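As a rough illustration of the setting (not the paper's asymptotic expansion), the following sketch compares the two estimators by Monte Carlo. It assumes a hypothetical four-category distribution p_true in which the total probability c of a known subset of categories A is given; under that constraint, the restricted MLE renormalizes the observed counts within A and within its complement. The names p_true, in_A, c, and the sample sizes are illustrative assumptions, and replicates with empty cells are skipped so that both Kullback--Leibler divergences remain finite, which is a simplification not made in the paper.

```python
import numpy as np

def kl(p, q):
    """Kullback--Leibler divergence D(p || q) between discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def mle_full(counts):
    """MLE of the cell probabilities under the full (unrestricted) model."""
    return counts / counts.sum()

def mle_submodel(counts, in_A, c):
    """MLE when the total probability of the categories in A is known to be c:
    counts are renormalized within A and within its complement."""
    p = np.empty(len(counts))
    p[in_A] = c * counts[in_A] / counts[in_A].sum()
    p[~in_A] = (1.0 - c) * counts[~in_A] / counts[~in_A].sum()
    return p

# Hypothetical setup: four categories; the first two are known to sum to 0.3.
p_true = np.array([0.10, 0.20, 0.30, 0.40])
in_A = np.array([True, True, False, False])
c = p_true[in_A].sum()

rng = np.random.default_rng(0)
n, reps = 50, 50_000
risk_full, risk_sub, kept = 0.0, 0.0, 0

for _ in range(reps):
    counts = rng.multinomial(n, p_true)
    if (counts == 0).any():
        continue  # skip empty cells so that both KL divergences stay finite
    risk_full += kl(p_true, mle_full(counts))
    risk_sub += kl(p_true, mle_submodel(counts, in_A, c))
    kept += 1

print(f"estimated KL risk, full model: {risk_full / kept:.5f}")
print(f"estimated KL risk, submodel:   {risk_sub / kept:.5f}")
```

Varying n in this sketch gives a crude sense of how the gap between the two risks behaves as the sample size changes, which is what the asymptotic expansion quantifies precisely.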