A matrix algorithm runs at {\em sublinear cost} if it uses far fewer memory cells and arithmetic operations than the input matrix has entries. Such algorithms are indispensable for Big Data mining and analysis, where the input matrices are typically so immense that one can realistically access only a small fraction of their entries, yet can access and process a Low Rank Approximation {\em (LRA)} of such a matrix at sublinear cost. Can we, however, compute an LRA at sublinear cost? An adversary argument shows that the output of any algorithm running at sublinear cost is extremely far from an LRA for worst case input matrices, and even for the matrices of the small families in our Appendix. Nevertheless, we prove that some deterministic sublinear cost algorithms output a reasonably close LRA, in the memory efficient form of CUR LRA, if the input matrix admits an LRA and is either Symmetric Positive Semidefinite or very close to a low rank matrix. The latter result is technically simple but provides some (very limited but long overdue) support for the well-known empirical efficiency of sublinear cost LRA by means of Cross-Approximation. We demonstrate the power of such LRA by turning the celebrated Fast Multipole Method into a Superfast Multipole Method. The design and analysis of our algorithms rely on an extensive prior study of the link between the LRA of a matrix and the maximization of its volume.
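To illustrate the sublinear access pattern behind Cross-Approximation, the following is a minimal Python sketch of one common scheme (adaptive cross-approximation with partial pivoting), not the paper's specific algorithms: it reads only about $(m+n)r$ of the $mn$ entries of an implicit $m\times n$ matrix. The names {\tt aca}, {\tt get\_row}, {\tt get\_col} and the pivoting heuristic are illustrative assumptions, not taken from the paper.
\begin{verbatim}
import numpy as np

def aca(get_row, get_col, m, n, rank, tol=1e-12):
    # Build factors U (m x k) and V (k x n) with A ~= U @ V while reading
    # only k <= rank full rows and columns of the implicit m x n matrix,
    # i.e. O((m + n) * rank) entries -- sublinear in m * n.
    U, V = [], []
    used_rows = set()
    i = 0                                  # first pivot row (simple heuristic)
    for _ in range(rank):
        used_rows.add(i)
        # residual of row i with respect to the current approximation
        r = get_row(i) - sum(u[i] * v for u, v in zip(U, V))
        j = int(np.argmax(np.abs(r)))      # column pivot: largest residual entry
        if abs(r[j]) < tol:                # row is already (numerically) reproduced
            break
        V.append(r / r[j])                 # scale so that the pivot entry equals 1
        c = get_col(j) - sum(u * v[j] for u, v in zip(U, V[:-1]))
        U.append(c)
        # next pivot row: largest entry of the new column outside the used rows
        score = np.abs(c)
        score[list(used_rows)] = -1.0
        i = int(np.argmax(score))
    if not U:                              # numerically zero input
        return np.zeros((m, 0)), np.zeros((0, n))
    return np.column_stack(U), np.vstack(V)

# usage on a synthetic rank-5 matrix (illustrative only)
rng = np.random.default_rng(0)
A = rng.standard_normal((300, 5)) @ rng.standard_normal((5, 200))
U, V = aca(lambda i: A[i, :], lambda j: A[:, j], 300, 200, rank=5)
print(np.linalg.norm(A - U @ V) / np.linalg.norm(A))   # tiny relative error
\end{verbatim}
The sampled rows and columns returned by such a scheme also yield a CUR LRA: with row index set $I$ and column index set $J$, one stores $C=M(:,J)$, $R=M(I,:)$ and the generator $G$, a pseudo inverse of $M(I,J)$, so that $M\approx CGR$ in memory efficient form.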

