The important recent book by G. Schurz appreciates that the no-free-lunch (NFL) theorems have major implications for the problem of (meta-)induction. Here I review the NFL theorems, emphasizing that they concern not only the case of a uniform prior: they prove that there are "as many priors" (loosely speaking) for which any induction algorithm $A$ out-generalizes some induction algorithm $B$ as priors for which the reverse holds. Importantly though, in addition to the NFL theorems there are many \textit{free lunch} theorems. In particular, the NFL theorems can only be used to compare the \textit{marginal} expected performance of an induction algorithm $A$ with the marginal expected performance of an induction algorithm $B$. There is a rich set of free lunches which instead concern the statistical correlations among the generalization errors of induction algorithms. As I describe, the meta-induction algorithms that Schurz advocates as a "solution to Hume's problem" are just one example of such a free lunch based on correlations among the generalization errors of induction algorithms. I end by pointing out that the prior that Schurz advocates, which is uniform over bit frequencies rather than bit patterns, is contradicted by thousands of experiments in statistical physics and by the great success of the maximum entropy procedure in inductive inference.
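As a minimal illustration (not from the paper) of the uniform-prior case of the NFL theorems: if we average off-training-set error uniformly over every possible target function on a tiny finite domain, any two learners that ignore the off-training-set labels achieve identical expected error. The domain size, the two toy algorithms, and the train/test split below are all illustrative choices.

```python
from itertools import product

X = range(4)      # input space: 4 points
train = [0, 1]    # training inputs (labels revealed to the learner)
test = [2, 3]     # off-training-set inputs

def ots_error(predict):
    """Average off-training-set error over all 2^4 target functions X -> {0,1}."""
    errs = []
    for f in product([0, 1], repeat=4):          # enumerate every target function
        data = {x: f[x] for x in train}           # training sample seen by the learner
        errs.append(sum(predict(data, x) != f[x] for x in test) / len(test))
    return sum(errs) / len(errs)

# Two toy induction algorithms: always predict 0, vs. predict the majority training label.
algo_zero = lambda data, x: 0
algo_majority = lambda data, x: max(set(data.values()), key=list(data.values()).count)

print(ots_error(algo_zero), ots_error(algo_majority))  # both 0.5
```

Under the uniform prior over target functions, each off-training-set label is equally likely to be 0 or 1 independently of the training data, so both learners (and any other) have expected off-training-set error exactly 1/2. This marginal equivalence is precisely what the free-lunch results about correlations among generalization errors sidestep.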