A Symmetric Linearized Alternating Direction Method of Multipliers for a Class of Stochastic Optimization Problems
Author affiliation: Networked Supporting Software International S&T Cooperation Base of China, Jiangxi Normal University, Nanchang 330022, China
Published in: Journal of Systems Science and Information (系统科学与信息学报(英文))
Year/Volume/Issue: 2023, Vol. 11, No. 1
Pages: 58-77
Subject classification: 12 [Management]; 1201 [Management: Management Science and Engineering (degrees awarded in Management or Engineering)]; 07 [Science]; 070105 [Science: Operations Research and Cybernetics]; 0701 [Science: Mathematics]
Funding: Supported by the National Natural Science Foundation of China (61662036)
Keywords: alternating direction method of multipliers; stochastic approximation; expected convergence rate and high probability bound; convex optimization; machine learning
Abstract: The alternating direction method of multipliers (ADMM) has received much attention in recent years due to various demands from machine learning and big data related fields. In 2013, Ouyang et al. extended the ADMM to the stochastic setting for solving some stochastic optimization problems, inspired by the structural risk minimization principle. In this paper, we consider a stochastic variant of symmetric ADMM, named symmetric stochastic linearized ADMM (SSL-ADMM). In particular, using the framework of variational inequality, we analyze the convergence properties of SSL-ADMM. Moreover, we show that, with high probability, SSL-ADMM has an O((ln N)·N^(-1/2)) constraint violation bound and objective error bound for convex problems, and an O((ln N)^2·N^(-1)) constraint violation bound and objective error bound for strongly convex problems, where N is the iteration number. Since symmetric ADMM can improve algorithmic performance compared to classical ADMM, numerical experiments for statistical machine learning show that such an improvement is also present in the stochastic setting.
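To illustrate the kind of iteration the abstract describes, the following is a minimal sketch of a symmetric stochastic linearized ADMM applied to a lasso-type problem min_x E[(a_i^T x - b_i)^2/2] + λ||y||_1 subject to x - y = 0. It is not the paper's exact algorithm: the function name `ssl_admm`, the step-size rule, and the hyperparameters (`rho`, relaxation factor `s`, `lam`) are illustrative assumptions. The structure, however, matches the ingredients named in the abstract: a linearized x-update driven by a stochastic gradient, and two (symmetric) dual updates bracketing the y-update.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ssl_admm(A, b, lam=0.1, rho=1.0, s=0.9, N=500, batch=8, seed=0):
    """Illustrative sketch (not the paper's exact method) of symmetric
    stochastic linearized ADMM for
        min_x  E[(a_i^T x - b_i)^2 / 2] + lam * ||y||_1   s.t.  x - y = 0.
    All hyperparameter choices here are assumptions for demonstration."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    y = np.zeros(n)
    dual = np.zeros(n)                       # Lagrange multiplier for x - y = 0
    for k in range(1, N + 1):
        eta = 1.0 / (rho + np.sqrt(k))       # diminishing step size, O(k^{-1/2})
        idx = rng.integers(m, size=batch)    # mini-batch -> stochastic gradient
        g = A[idx].T @ (A[idx] @ x - b[idx]) / batch
        # linearized x-update: one gradient step on the stochastic
        # augmented Lagrangian instead of an exact minimization
        x = x - eta * (g + dual + rho * (x - y))
        # first (symmetric) dual update, with relaxation factor s
        dual = dual + s * rho * (x - y)
        # exact y-update: proximal step (soft-thresholding) on the l1 term
        y = soft_threshold(x + dual / rho, lam / rho)
        # second dual update
        dual = dual + rho * (x - y)
    return x, y
```

The symmetric structure shows up in the two `dual` updates: classical ADMM would perform only the second one, while the symmetric variant also updates the multiplier between the x- and y-steps, which is the modification the paper carries into the stochastic setting.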