Parametrized max-affine (PMA) and parametrized log-sum-exp (PLSE) networks are proposed for general decision-making problems. The proposed approximators generalize the existing convex approximators, namely max-affine (MA) and log-sum-exp (LSE) networks, by taking both condition and decision variables as function arguments and by replacing the network parameters of MA and LSE networks with continuous functions of the condition variable. A universal approximation theorem for PMA and PLSE is proven, which implies that PMA and PLSE are shape-preserving universal approximators for parametrized convex continuous functions. Practical guidelines for incorporating deep neural networks within PMA and PLSE networks are provided. A numerical simulation demonstrates the performance of the proposed approximators; the results indicate that PLSE outperforms existing approximators in terms of minimizer and optimal-value errors, with scalable and efficient computation in high-dimensional cases.
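To make the construction concrete, the following is a minimal sketch (not the authors' implementation) of PMA and PLSE surrogates. It assumes a hypothetical one-hidden-layer network `affine_pieces` that maps the condition variable x to the slopes a_i(x) and intercepts b_i(x) of I affine pieces, and a temperature-like smoothing parameter T for PLSE; all names and architecture choices here are illustrative assumptions.

```python
# Sketch of PMA/PLSE under the assumptions stated above.
import jax
import jax.numpy as jnp

def init_params_net(key, n_x, n_z, I, hidden=64):
    """Random weights for a one-hidden-layer net mapping x to (a, b)."""
    k1, k2 = jax.random.split(key)
    W1 = jax.random.normal(k1, (hidden, n_x)) / jnp.sqrt(n_x)
    W2 = jax.random.normal(k2, (I * (n_z + 1), hidden)) / jnp.sqrt(hidden)
    return dict(W1=W1, b1=jnp.zeros(hidden), W2=W2, b2=jnp.zeros(I * (n_z + 1)))

def affine_pieces(theta, x, I, n_z):
    """Continuous map x -> (a_i(x), b_i(x)), i = 1..I (hypothetical choice)."""
    h = jnp.tanh(theta["W1"] @ x + theta["b1"])
    out = theta["W2"] @ h + theta["b2"]
    a = out[: I * n_z].reshape(I, n_z)   # slopes of the affine pieces
    b = out[I * n_z:]                    # intercepts of the affine pieces
    return a, b

def pma(theta, x, z, I, n_z):
    """Parametrized max-affine: max_i a_i(x)^T z + b_i(x); convex in z for fixed x."""
    a, b = affine_pieces(theta, x, I, n_z)
    return jnp.max(a @ z + b)

def plse(theta, x, z, I, n_z, T=0.1):
    """Parametrized log-sum-exp: smooth convex-in-z counterpart of PMA."""
    a, b = affine_pieces(theta, x, I, n_z)
    return T * jax.scipy.special.logsumexp((a @ z + b) / T)
```

Because both surrogates are convex in the decision variable z for each fixed condition x, the decision step after fitting the parameters reduces to a convex program in z, which is what enables scalable minimizer computation in high-dimensional cases.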