We study the time complexity of computing the $(\min,+)$ matrix product of two $n\times n$ integer matrices in terms of $n$ and the number of monotone subsequences into which the rows of the first matrix and the columns of the second matrix can be decomposed. In particular, we show that if each row of the first matrix can be decomposed into at most $m_1$ monotone subsequences and each column of the second matrix can be decomposed into at most $m_2$ monotone subsequences, such that all the subsequences are non-decreasing or all of them are non-increasing, then the $(\min,+)$ product of the matrices can be computed in $O(m_1m_2n^{2.569})$ time. On the other hand, we observe that if all the rows of the first matrix are non-decreasing and all the columns of the second matrix are non-increasing, or {\em vice versa}, then this case is as hard as the general one. Similarly, we study the time complexity of computing the $(\min,+)$ convolution of two $n$-dimensional integer vectors in terms of $n$ and the number of monotone subsequences into which the two vectors can be decomposed. We show that if the first vector can be decomposed into at most $m_1$ monotone subsequences and the second vector can be decomposed into at most $m_2$ monotone subsequences, such that all the subsequences of the first vector are non-decreasing and all those of the second vector are non-increasing, or {\em vice versa}, then their $(\min,+)$ convolution can be computed in $\tilde{O}(m_1m_2n^{1.5})$ time. On the other hand, the case when both vectors are non-decreasing or both of them are non-increasing is as hard as the general case.
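For reference, we recall the standard definitions of the two operations considered above; the indexing convention below is chosen only for concreteness. For $n\times n$ matrices $A$ and $B$, their $(\min,+)$ product $C$ is given by
\[
C[i,j] \;=\; \min_{1 \le k \le n} \bigl( A[i,k] + B[k,j] \bigr), \qquad 1 \le i,j \le n,
\]
and for $n$-dimensional vectors $a$ and $b$ (indexed from $0$), their $(\min,+)$ convolution $c$ is the $(2n-1)$-dimensional vector with
\[
c[k] \;=\; \min_{\substack{i+j=k \\ 0 \le i,\,j \le n-1}} \bigl( a[i] + b[j] \bigr), \qquad 0 \le k \le 2n-2.
\]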