Project Title: Research on Distributed Cooperative Optimization Algorithms over Multi-Agent Networks Based on Mini-Batch Sampling
Grant Number: No. 61503308
Project Type: Young Scientists Fund Project
Approval Year: 2016
Discipline: Automation Technology; Computer Technology
Principal Investigator: 王慧维 (Wang Huiwei)
Affiliation: Southwest University (西南大学)
Funding Amount: RMB 210,000 (21万元)
Chinese Abstract: The ever-growing scale of data renders traditional batch-based optimization algorithms unsuitable for large-scale data optimization problems. The distributed optimization algorithms based on mini-batch gradient sampling proposed in this project combine the strengths of distributed optimization, namely low implementation requirements, high operating efficiency, strong privacy protection, and resilience to communication interference, with the generalization ability of data samples and the parallelism of batch sampling, making them an ideal current approach to large-scale data optimization problems. The project handles the partially unknown objective functions of the optimization problem via random-perturbation smoothing; drawing on the ideas of mirror descent, accelerated gradient, proximal gradient, and dual averaging methods, it designs a family of distributed optimization algorithms for large-scale data processing; using spectral graph theory and Bregman distance theory, it analyzes the algorithms' convergence error, convergence rate, and computational complexity, and explores in depth how convergence performance relates to algorithm parameters, network scale, and constraint conditions. The algorithms are trained and tested on real data sets and progressively refined. A systematic study of distributed optimization algorithms based on mini-batch gradient sampling will help advance solution methods for large-scale data optimization problems.
Chinese Keywords: distributed optimization; multi-agent networks; mini-batch sampling; consensus; large-scale data
English Abstract: Traditional optimization algorithms based on batch processing are no longer suitable for solving large-scale data optimization problems, owing to the ever-increasing scale of data. The proposed distributed optimization algorithms based on mini-batch gradient sampling combine the strengths of distributed optimization, namely low implementation requirements, high efficiency, strong privacy protection, and resistance to communication interference, with the generalization ability of data samples and the parallelism of batch sampling, making them an ideal current way to solve large-scale data optimization problems. By applying random-perturbation smoothing to the partially unknown objective functions, and drawing on the ideas of mirror descent, accelerated gradient, proximal gradient, and dual averaging methods, we design a series of distributed optimization algorithms oriented toward large-scale data. Using spectral graph theory and Bregman distance theory, we analyze the convergence error, convergence rate, and computational complexity of the algorithms, and explore in depth the relationships among the algorithms' convergence performance, the algorithm parameters, the network scale, and the parameters of the constraint conditions. We train and test the algorithms on real data sets and progressively improve their performance. A systematic study of distributed optimization algorithms based on mini-batch gradient sampling will help advance solution methods for large-scale data optimization problems.
English Keywords: distributed optimization; multi-agent networks; mini-batch sampling; consensus; large-scale data
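To make the scheme described in the abstracts concrete, below is a minimal sketch of consensus-based distributed optimization with mini-batch gradient sampling. It is not the project's actual algorithm: the ring topology, Metropolis mixing weights, least-squares objective, and step-size schedule are all illustrative assumptions. Each agent holds a private data shard, samples a mini-batch to form its local gradient, and averages its iterate with neighbors through a doubly stochastic matrix W.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, samples_per_agent, batch = 5, 10, 200, 16

# Synthetic local data: all agents' shards share one ground-truth parameter.
x_true = rng.normal(size=dim)
A = [rng.normal(size=(samples_per_agent, dim)) for _ in range(n_agents)]
b = [Ai @ x_true + 0.1 * rng.normal(size=samples_per_agent) for Ai in A]

# Doubly stochastic mixing matrix for a ring network (Metropolis-style weights).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    for j in ((i - 1) % n_agents, (i + 1) % n_agents):
        W[i, j] = 1.0 / 3.0
    W[i, i] = 1.0 - W[i].sum()

x = np.zeros((n_agents, dim))          # each row holds one agent's iterate
for t in range(1, 501):
    step = 1.0 / np.sqrt(t)            # diminishing step size
    grads = np.zeros_like(x)
    for i in range(n_agents):
        idx = rng.choice(samples_per_agent, size=batch, replace=False)
        Ai, bi = A[i][idx], b[i][idx]
        # Mini-batch gradient of the local least-squares loss.
        grads[i] = Ai.T @ (Ai @ x[i] - bi) / batch
    # Consensus step (neighbor averaging) followed by a local gradient step.
    x = W @ x - step * grads

print("max disagreement between agents:", np.abs(x - x.mean(axis=0)).max())
print("distance to ground truth:", np.linalg.norm(x.mean(axis=0) - x_true))
```

The consensus term drives the agents toward agreement while the mini-batch gradients drive the common iterate toward the minimizer; the spectral gap of W, a spectral-graph-theoretic quantity, is what governs how fast disagreement decays in analyses of this kind.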
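The abstracts also mention random-perturbation smoothing for partially unknown objective functions. The sketch below illustrates the standard two-point Gaussian-smoothing gradient estimator, one common realization of that idea, for a black-box objective where only function values are available; the test function, smoothing radius, and sample count are illustrative assumptions, not the project's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Black-box objective: assume only function values can be queried.
    return 0.5 * np.sum(x ** 2) + np.sum(np.abs(x))

def smoothed_grad(f, x, mu=1e-3, n_samples=64):
    """Two-point Gaussian-smoothing estimate of the gradient of the
    smoothed surrogate f_mu(x) = E_u[f(x + mu * u)], u ~ N(0, I)."""
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.normal(size=x.shape)
        g += (f(x + mu * u) - f(x)) / mu * u
    return g / n_samples

x = rng.normal(size=5)
for t in range(1, 301):
    x -= (0.05 / np.sqrt(t)) * smoothed_grad(f, x)

print("final iterate (the minimizer of f is the origin):", x)
```

Because the estimator needs only function evaluations, it lets gradient-style updates, including the distributed ones sketched above, be applied even when an agent's objective is only partially known, at the cost of estimation variance that the analysis must account for.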