This paper presents theoretical results relating the Bregman log-determinant matrix divergence to Kaporin's condition number. Both can be viewed as measures of nearness between a preconditioner and a given matrix, and we establish conditions under which the two functions coincide. We also give examples of constraint sets over which minimising the two objectives is equivalent. We focus on preconditioners that are the sum of a positive definite matrix and a low-rank matrix, developed in previous work. These were constructed as minimisers of the aforementioned divergence, and we show that they are only a constant scaling away from also minimising Kaporin's condition number. We highlight connections to information geometry and comment on future directions.
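As a plausible numerical illustration of the two nearness measures (not the paper's own construction), the sketch below uses the standard definitions: the Bregman log-determinant divergence D(A, P) = tr(P⁻¹A) − log det(P⁻¹A) − n, and Kaporin's condition number K(M) = (tr(M)/n) / det(M)^{1/n}. Since K is invariant under scaling of M, rescaling the preconditioner so that tr(P⁻¹A) = n makes D(A, cP) equal n·log K(P⁻¹A), consistent with the abstract's "constant scaling" remark. The Jacobi preconditioner here is an arbitrary example choice.

```python
import numpy as np

def bregman_logdet_div(A, P):
    """Bregman log-determinant divergence D(A, P) = tr(P^-1 A) - log det(P^-1 A) - n."""
    n = A.shape[0]
    M = np.linalg.solve(P, A)          # P^-1 A
    _, logdet = np.linalg.slogdet(M)
    return np.trace(M) - logdet - n

def kaporin(M):
    """Kaporin's condition number K(M) = (tr(M)/n) / det(M)^(1/n)."""
    n = M.shape[0]
    _, logdet = np.linalg.slogdet(M)
    return (np.trace(M) / n) / np.exp(logdet / n)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
A = X @ X.T + 5 * np.eye(5)            # a symmetric positive definite test matrix
P = np.diag(np.diag(A))                # Jacobi (diagonal) preconditioner, for illustration

n = A.shape[0]
M = np.linalg.solve(P, A)
c = np.trace(M) / n                    # rescale so that tr((cP)^-1 A) = n

# With the trace normalised, the divergence equals n * log K(P^-1 A);
# note K itself is unchanged by the rescaling.
print(bregman_logdet_div(A, c * P))
print(n * np.log(kaporin(M)))
```

The scale-invariance of K means minimising it fixes the preconditioner only up to a positive multiple, whereas the divergence picks out a particular scaling.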