We propose an approach based on function evaluations and Bayesian inference to extract higher-order differential information of objective functions from a given ensemble of particles. Pointwise evaluations $\{V(x^i)\}_i$ of some potential $V$ in an ensemble $\{x^i\}_i$ contain implicit information about first- or higher-order derivatives, which can be made explicit with little computational effort (ensemble-based gradient inference -- EGI). We suggest using this information to improve established ensemble-based numerical methods for optimization and sampling, such as Consensus-based optimization and Langevin-based samplers. Numerical studies indicate that the augmented algorithms are often superior to their gradient-free variants; in particular, the augmented methods help the ensembles to escape their initial domain, to explore multimodal, non-Gaussian settings, and to speed up the collapse at the end of the optimization dynamics. The code for the numerical examples in this manuscript can be found in the paper's GitHub repository (https://github.com/MercuryBench/ensemble-based-gradient.git).
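The core idea -- that pointwise evaluations of a potential in an ensemble carry implicit gradient information -- can be illustrated with a minimal sketch. The snippet below fits an affine surrogate to the ensemble evaluations by least squares and reads off its slope as a gradient estimate; this is only an illustration of the general principle, not the paper's Bayesian formulation (the function name `ensemble_gradient` and all details are assumptions for this sketch).

```python
import numpy as np

def ensemble_gradient(X, fX, i):
    # Estimate the gradient of V at ensemble member x^i by a
    # least-squares fit of an affine surrogate to the remaining
    # evaluations: V(x^j) - V(x^i) ~ g . (x^j - x^i).
    # Sketch of the ensemble-based gradient idea only, not the
    # paper's Bayesian inference procedure.
    mask = np.arange(len(X)) != i      # exclude the point itself
    dX = X[mask] - X[i]                # particle displacements
    dF = fX[mask] - fX[i]              # evaluation differences
    g, *_ = np.linalg.lstsq(dX, dF, rcond=None)
    return g

# For a linear potential V(x) = a . x the affine surrogate is
# exact, so the estimate recovers the true gradient a.
rng = np.random.default_rng(0)
a = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(12, 3))   # ensemble of 12 particles in R^3
fX = X @ a                     # pointwise evaluations V(x^i)
g = ensemble_gradient(X, fX, 0)
```

For nonlinear potentials the estimate is only a local approximation, with accuracy depending on how tightly the ensemble clusters around $x^i$.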