Optimization of conflicting functions is of paramount importance in decision making, and real-world applications frequently involve data that is uncertain or unknown, resulting in multi-objective optimization (MOO) problems of stochastic type. We study the stochastic multi-gradient (SMG) method, seen as an extension of the classical stochastic gradient method for single-objective optimization. At each iteration of the SMG method, a stochastic multi-gradient direction is calculated by solving a quadratic subproblem, and it is shown that this direction is biased even when all individual gradient estimators are unbiased. We establish rates for computing a point in the Pareto front, of the same order as those known for the stochastic gradient method in both the convex and strongly convex cases. The analysis handles the bias in the multi-gradient and the fact that the weights of the limiting Pareto point are unknown a priori. The SMG method is then framed into a Pareto-front type algorithm for the computation of the entire Pareto front. The Pareto-front SMG algorithm is capable of robustly determining Pareto fronts for a number of synthetic test problems. It can be applied to any stochastic MOO problem arising from supervised machine learning, and we report results for logistic binary classification where the multiple objectives correspond to data groups from distinct sources.
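To make the iteration concrete, the following is a minimal sketch (not the paper's implementation) of the SMG step for two objectives, where the quadratic subproblem over the simplex has a closed-form solution: minimize ||lam*g1 + (1-lam)*g2||^2 over lam in [0, 1], and step along the negative of the resulting convex combination. The noisy quadratic objectives used to drive the iteration are illustrative assumptions.

```python
import random


def dot(u, v):
    return sum(a * b for a, b in zip(u, v))


def multi_gradient_direction(g1, g2):
    """Closed-form solution of the two-objective quadratic subproblem:
    minimize ||lam*g1 + (1-lam)*g2||^2 over lam in [0, 1]."""
    diff = [b - a for a, b in zip(g1, g2)]  # g2 - g1
    denom = dot(diff, diff)
    if denom == 0.0:
        lam = 0.5  # gradients coincide; any convex weight works
    else:
        lam = max(0.0, min(1.0, dot(g2, diff) / denom))
    # Common descent direction: negative minimum-norm convex combination.
    d = [-(lam * a + (1.0 - lam) * b) for a, b in zip(g1, g2)]
    return d, lam


def smg(x, grad_fns, steps=2000, lr=0.05, noise=0.01, seed=0):
    """Stochastic multi-gradient iteration: unbiased noisy gradient
    estimates feed the subproblem at every step (illustrative setup)."""
    rng = random.Random(seed)
    for _ in range(steps):
        g1 = [g + rng.gauss(0.0, noise) for g in grad_fns[0](x)]
        g2 = [g + rng.gauss(0.0, noise) for g in grad_fns[1](x)]
        d, _ = multi_gradient_direction(g1, g2)
        x = [xi + lr * di for xi, di in zip(x, d)]
    return x


# Two strongly convex quadratics f1 = ||x - (0,0)||^2, f2 = ||x - (1,0)||^2;
# their Pareto set is the segment joining the two minimizers.
pareto_point = smg(
    [2.0, 2.0],
    [lambda x: [2.0 * x[0], 2.0 * x[1]],
     lambda x: [2.0 * (x[0] - 1.0), 2.0 * x[1]]],
)
```

For the two conflicting gradients `g1 = (1, 0)` and `g2 = (0, 1)`, the subproblem yields `lam = 0.5` and the direction `(-0.5, -0.5)`, which is a descent direction for both objectives simultaneously; this is the property the quadratic subproblem enforces at every iterate.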