Prediction is a central task in statistics and machine learning, yet many inferential settings provide only partial information, typically in the form of moment constraints or estimating equations. We develop a finite, fully Bayesian framework for propagating such partial information through predictive distributions. Building on de Finetti's representation theorem, we construct a curvature-adaptive version of exchangeable updating that operates directly under finite constraints, yielding an explicit discrete-Gaussian mixture that quantifies predictive uncertainty. The resulting finite-sample bounds depend on the smallest eigenvalue of the information-geometric Hessian, which measures the curvature and identification strength of the constraint manifold. This approach unifies empirical likelihood, Bayesian empirical likelihood, and generalized method-of-moments estimation within a common predictive geometry. On the operational side, it provides computable, curvature-sensitive uncertainty bounds for constrained prediction; on the theoretical side, it recovers de Finetti's coherence, Doob's martingale convergence, and local asymptotic normality as limiting cases of the same finite mechanism. Our framework thus offers a constructive bridge between partial information and full Bayesian prediction.
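To make the operational claim concrete, the following is a minimal illustrative sketch of how a curvature-sensitive bound driven by the smallest Hessian eigenvalue might be computed from estimating equations. The function name `min_curvature_bound`, the constant `c`, and the `1/(n * lambda_min)` scaling are assumptions chosen to mirror the abstract's description, not the paper's actual construction.

```python
import numpy as np

def min_curvature_bound(g, c=1.0):
    """Illustrative curvature-sensitive uncertainty bound (hypothetical form).

    g : (n, m) array of estimating-equation evaluations g(x_i, theta_hat),
        assumed mean-zero at the fitted parameter.
    Returns a bound proportional to 1 / (n * lambda_min(H)), where
    H = (1/n) g^T g stands in for the information-geometric Hessian.
    """
    n = g.shape[0]
    H = g.T @ g / n                     # empirical Hessian of the constraint geometry
    lam_min = np.linalg.eigvalsh(H)[0]  # eigvalsh sorts ascending: smallest eigenvalue
    if lam_min <= 0:
        raise ValueError("constraints not identified: Hessian is singular")
    return c / (n * lam_min)            # weak curvature (small lam_min) widens the bound

# Hypothetical usage: a single mean-zero moment constraint E[x - theta] = 0.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
g = (x - x.mean())[:, None]             # n x 1 matrix of moment evaluations
print(min_curvature_bound(g))           # roughly 1 / (n * Var(x))
```

Reading off `lambda_min` from `H = (1/n) g^T g` is meant to echo the GMM/empirical-likelihood dual, in which flat directions of the constraint manifold correspond to weak identification and hence wider predictive bounds; the paper's exact Hessian and bound may differ.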