We prove that deep neural networks with the ReLU activation function are capable of approximating solutions of semilinear partial integro-differential equations in the case of gradient-independent, Lipschitz-continuous nonlinearities, where the number of parameters in the approximating neural networks grows at most polynomially in both the dimension $d\in\mathbb{N}$ and the reciprocal of the prescribed approximation accuracy $\epsilon$.