In today's era of big data, data, especially private data, are becoming increasingly valuable. Secure Multi-Party Computation (SMPC) enables multiple parties to jointly perform computing tasks without revealing their original data. However, the underlying building blocks of SMPC, such as garbled circuits (GC) and oblivious transfer (OT), are computationally heavy: each additional piece of input data substantially increases the resources consumed by GC and OT. Processing large-scale data in a single SMPC task is therefore impractical. In this work, we propose a novel theory called SMPC Task Decomposition (SMPCTD), which securely decomposes a single SMPC task into multiple SMPC sub-tasks and multiple local tasks without leaking the original data. After decomposition, computing time, memory, and communication consumption drop sharply. We then use our theory to decompose three machine learning (ML) SMPC tasks and implement them on top of the hybrid protocol framework ABY. Furthermore, we use an incremental computation technique to scale up the amount of data involved in these three SMPC tasks. The experimental results show that after decomposing these three SMPC tasks, time, memory, and communication consumption are not only greatly reduced but also remain stable within a fixed range.