In this paper we study a worst-case to average-case reduction for the problem of matrix multiplication over finite fields. Suppose we have an efficient average-case algorithm that, given two random matrices $A, B$, outputs a matrix that has a non-trivial correlation with their product $A \cdot B$. Can we transform it into a worst-case algorithm that outputs the correct answer on all inputs, without incurring a significant overhead in running time? We present two results in this direction.

(1) Two-sided error in the high-agreement regime: We begin with a brief remark about a reduction for high-agreement algorithms, i.e., algorithms that agree with the correct output on a large (say $> 0.9$) fraction of entries, and show that the standard self-correction of linearity allows us to transform such algorithms into algorithms that work in the worst case.

(2) One-sided error in the low-agreement regime: Focusing on average-case algorithms with one-sided error, we show that over $\mathbb{F}_2$ there is a reduction that takes an $O(T)$-time average-case algorithm which, given a random input $A, B$, outputs a matrix that agrees with $A \cdot B$ on at least $51\%$ of the entries (i.e., has only a slight advantage over the trivial algorithm), and transforms it into an $\widetilde{O}(T)$-time worst-case algorithm that outputs the correct answer on all inputs with high probability.
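To make the self-correction idea behind (1) concrete, below is a minimal sketch over $\mathbb{F}_2$ in Python/NumPy. It assumes a hypothetical average-case oracle `avg_mm` that, on uniformly random inputs, errs on each individual entry with probability well below $1/8$; this is a simplifying per-entry assumption made for illustration only, not the exact agreement guarantee analyzed in the paper. Each input is split into two uniformly random shares, so every oracle call is fed a uniformly distributed pair, and a per-entry majority vote over independent trials recovers $A \cdot B$.

```python
import numpy as np

def self_correct_mm(avg_mm, A, B, trials=15):
    """Worst-case to average-case self-correction for matrix products over F_2.

    avg_mm: hypothetical average-case subroutine that, on uniformly random
            0/1 matrices, errs on each output entry with small probability.
    A, B:   arbitrary (worst-case) 0/1 matrices of shapes (n, k) and (k, m).
    """
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    votes = np.zeros((n, m), dtype=np.int64)
    for _ in range(trials):
        # Split each input into two shares; each share is uniformly random,
        # so every oracle call below sees a uniformly distributed pair.
        R = np.random.randint(0, 2, size=(n, k))
        S = np.random.randint(0, 2, size=(k, m))
        A1, A2 = R, A ^ R          # A = A1 + A2 over F_2
        B1, B2 = S, B ^ S          # B = B1 + B2 over F_2
        # Linearity: A*B = A1*B1 + A1*B2 + A2*B1 + A2*B2 (mod 2).
        guess = (avg_mm(A1, B1) ^ avg_mm(A1, B2)
                 ^ avg_mm(A2, B1) ^ avg_mm(A2, B2)) & 1
        votes += guess
    # Per-entry majority vote over independent trials boosts the agreement.
    return (votes * 2 > trials).astype(np.int64)

if __name__ == "__main__":
    # Stand-in "average-case" oracle for testing: the exact product mod 2.
    exact = lambda X, Y: (X @ Y) & 1
    A = np.random.randint(0, 2, size=(8, 8))
    B = np.random.randint(0, 2, size=(8, 8))
    assert np.array_equal(self_correct_mm(exact, A, B), (A @ B) & 1)
```

The sketch uses only $O(1)$ oracle calls per trial and a logarithmic number of trials, which is why the overhead of such a reduction stays within polylogarithmic factors of the average-case running time.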