The rate-distortion function (RDF) has long served as an information-theoretic benchmark for data compression. Its natural extension, the indirect rate-distortion function (iRDF), corresponds to the scenario where the encoder can only access an observation correlated with the source, rather than the source itself. Such a scenario is also relevant to modern applications like remote sensing and goal-oriented communication. The iRDF can be reduced to a standard RDF with the distortion measure replaced by its expectation conditioned on the observation. This reduction, however, poses a non-trivial challenge when one needs to estimate the iRDF from datasets alone: without statistical knowledge of the joint probability distribution between the source and its observation, the conditional expectation cannot be evaluated. To tackle this challenge, starting from the well-known fact that the conditional expectation is the minimum mean-squared error estimator and exploiting a Markovian relationship, we identify a functional equivalence between the reduced distortion measure in the iRDF and the solution of a quadratic loss minimization problem, which can be efficiently approximated by a neural network approach. We proceed to reformulate the iRDF as a variational problem corresponding to the Lagrangian representation of the iRDF curve, and propose a neural-network-based approximate solution that integrates the aforementioned distortion measure estimator. Asymptotic analysis guarantees consistency of the solution, and numerical experimental results demonstrate the accuracy and effectiveness of the algorithm.
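The core reduction can be illustrated numerically. The sketch below is a hypothetical toy example, not the paper's method: it assumes a jointly Gaussian pair (source X, observation Y = X + N) and uses linear least squares over polynomial features of Y in place of a neural network, exploiting the fact that the minimizer of the quadratic loss E[(d(X, x̂) − g(Y))²] over functions g is exactly the reduced distortion measure E[d(X, x̂) | Y].

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Toy model (assumed for illustration): X ~ N(0, 1), Y = X + N with N ~ N(0, 0.25).
x = rng.standard_normal(n)
y = x + 0.5 * rng.standard_normal(n)

# Fix a reconstruction point x_hat and form per-sample distortions d(X, x_hat).
x_hat = 0.0
target = (x - x_hat) ** 2

# Minimize the quadratic loss over g(Y) = c0 + c1*Y + c2*Y^2 (stand-in for a
# neural network); the minimizer approximates E[d(X, x_hat) | Y].
A = np.stack([np.ones(n), y, y ** 2], axis=1)
coef, *_ = np.linalg.lstsq(A, target, rcond=None)

# Analytic reduced distortion for this Gaussian model:
# E[(X - 0)^2 | Y] = Var(X | Y) + E[X | Y]^2 = 0.2 + (0.8 * Y)^2,
# so the fit should recover roughly c0 ≈ 0.2, c1 ≈ 0, c2 ≈ 0.64.
print(coef)
```

In the paper's setting the quadratic features are replaced by a neural network trained on samples of (Y, d(X, x̂)), which extends the same principle beyond models where the conditional expectation has a closed form.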