Recently, the distribution-dependent Mumford-Shah model for hyperspectral image segmentation was introduced. It approximates an image based on first- and second-order statistics using a data term, which is built from a Mahalanobis distance plus a covariance regularization, and the total variation as spatial regularizer. Moreover, to ensure feasibility, the matrices appearing in the functional are restricted to symmetric positive definite matrices whose eigenvalues exceed a certain threshold. This threshold is chosen in advance as a data-independent parameter. In this article, we study theoretical properties of the model. In particular, we prove the existence of minimizers of the functional and show its $\Gamma$-convergence as the threshold regularizing the eigenvalues of the matrices tends to zero. It turns out that in the $\Gamma$-limit the guaranteed existence of minimizers is lost, and we give an example of an image for which the $\Gamma$-limit indeed has no minimizer. Finally, we derive a formula for the minimal eigenvalues of the covariance matrices appearing in the functional, which indicates under which conditions the functional can handle the data without regularizing the eigenvalues. The results of this article demonstrate the significance of the eigenvalue regularization for the model and show that it cannot be dropped without substantial modifications.
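To make the structure described above concrete, the following display is a minimal schematic sketch, not the precise functional from the article: the notation ($f$ for the hyperspectral data, $u$ for the mean field, $\Sigma$ for the covariance field, $\lambda$ for a regularization weight, $\varepsilon$ for the eigenvalue threshold) and the log-determinant form of the covariance regularization are illustrative assumptions chosen to match a Gaussian log-likelihood-type data term.

```latex
% Schematic sketch only: the notation and the log-det covariance regularization
% are illustrative assumptions; the exact functional is defined in the article.
\begin{equation*}
  \min_{u,\,\Sigma}\;
  \int_{\Omega} \Big( \big(f(x)-u(x)\big)^{\!\top} \Sigma(x)^{-1} \big(f(x)-u(x)\big)
      + \log\det \Sigma(x) \Big)\, \mathrm{d}x
  \;+\; \lambda\, \mathrm{TV}(u,\Sigma),
\end{equation*}
\begin{equation*}
  \text{subject to } \Sigma(x) \text{ symmetric positive definite with }
  \lambda_{\min}\big(\Sigma(x)\big) \ge \varepsilon \text{ for all } x \in \Omega .
\end{equation*}
```

The first integral corresponds to the Mahalanobis distance plus covariance regularization mentioned above, the total variation term provides the spatial regularization, and the constraint on $\lambda_{\min}$ is the data-independent eigenvalue threshold whose limit $\varepsilon \to 0$ is studied via $\Gamma$-convergence.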