In decision-making problems under uncertainty, predicting the unknown parameters is often treated as being independent of the optimization step. Decision-focused learning (DFL) is a task-oriented framework that integrates prediction and optimization by adapting the predictive model to yield better decisions for the corresponding task. An inevitable challenge arises here when computing gradients of the optimal decision with respect to the predicted parameters. Existing studies cope with this issue by smoothing the optimization into a surrogate problem or by constructing surrogate loss functions that mimic the task loss; however, these approaches apply only to restricted optimization domains. In this paper, we propose the Locally Convex Global Loss Network (LCGLN), a global surrogate loss model that can be plugged into a general DFL paradigm. LCGLN learns the task loss via a partial input convex neural network, which is guaranteed to be convex in the chosen inputs while keeping a non-convex global structure for the remaining inputs. This enables LCGLN to support general DFL through a single surrogate loss, without requiring any expertise in choosing an appropriate parametric form. We confirm the effectiveness and flexibility of LCGLN by evaluating it on three stochastic decision-making problems.
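To make the architectural idea concrete, the sketch below shows a minimal partially input convex neural network (PICNN) in the spirit of Amos et al. (2017), which is the building block the abstract refers to: the output is convex in the decision-side input y for any fixed context x, while the path processing x remains unconstrained. All layer sizes, module names, and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
# Minimal PICNN sketch (assumed PyTorch): convex in y, unrestricted in x.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PICNN(nn.Module):
    def __init__(self, x_dim, y_dim, hidden=64, n_layers=2):
        super().__init__()
        u_dims = [x_dim] + [hidden] * n_layers   # context-path widths
        z_dims = [hidden] * n_layers + [1]       # convex-path widths
        # Context path: arbitrary (non-convex) transformation of x.
        self.u_layers = nn.ModuleList(
            [nn.Linear(u_dims[i], u_dims[i + 1]) for i in range(n_layers)]
        )
        # Convex path: weights multiplying z are kept nonnegative at forward time.
        self.Wz = nn.ParameterList(
            [nn.Parameter(0.1 * torch.rand(z_dims[i + 1], z_dims[i])) for i in range(n_layers)]
        )
        self.Wy = nn.ModuleList([nn.Linear(y_dim, z_dims[i + 1]) for i in range(n_layers)])
        self.Wu = nn.ModuleList([nn.Linear(u_dims[i], z_dims[i + 1]) for i in range(n_layers)])
        self.Wzu = nn.ModuleList([nn.Linear(u_dims[i], z_dims[i]) for i in range(n_layers)])
        self.Wyu = nn.ModuleList([nn.Linear(u_dims[i], y_dim) for i in range(n_layers)])

    def forward(self, x, y):
        u = x
        z = torch.zeros(y.shape[0], self.Wz[0].shape[1], device=y.device)
        for i in range(len(self.Wz)):
            # Gates depend only on the context u, so they are constants w.r.t. y.
            z_gate = F.relu(self.Wzu[i](u))   # nonnegative gate on z
            y_gate = self.Wyu[i](u)           # unconstrained gate on y
            # Nonnegative weights on z plus a convex, nondecreasing activation
            # (softplus) keep the composition convex in y.
            z = F.softplus(
                F.linear(z * z_gate, self.Wz[i].clamp(min=0))
                + self.Wy[i](y * y_gate)
                + self.Wu[i](u)
            )
            u = F.relu(self.u_layers[i](u))
        return z  # shape (batch, 1): surrogate task-loss value


# Usage sketch: evaluate the surrogate loss for a batch of (context, decision) pairs.
if __name__ == "__main__":
    model = PICNN(x_dim=10, y_dim=5)
    x, y = torch.randn(32, 10), torch.randn(32, 5, requires_grad=True)
    loss_hat = model(x, y).mean()
    loss_hat.backward()  # gradients w.r.t. y exist and the landscape is convex in y
```

Because the surrogate is convex in y for every x, minimizing it over decisions is well posed, while the unconstrained x-path lets the surrogate vary non-convexly across problem instances, matching the "locally convex, globally non-convex" structure described above.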