Differentially private SGD with random features

Authors: WANG Yi-guang; GUO Zheng-chu

Author affiliations: Polytechnic Institute of Zhejiang University, Hangzhou 310015, China; School of Mathematical Sciences, Zhejiang University, Hangzhou 310058, China

Published in: Applied Mathematics (A Journal of Chinese Universities) (高校应用数学学报(英文版)(B辑))

Year/Volume/Issue: 2024, Vol. 39, No. 1

Pages: 1-23

Subject classification: 12 [Management]; 1201 [Management - Management Science and Engineering (degrees in Management or Engineering)]; 081104 [Engineering - Pattern Recognition and Intelligent Systems]; 0839 [Engineering - Cyberspace Security]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 081201 [Engineering - Computer Architecture]; 0811 [Engineering - Control Science and Engineering]; 0812 [Engineering - Computer Science and Technology (degrees in Engineering or Science)]

Funding: Supported by the Zhejiang Provincial Natural Science Foundation of China (LR20A010001) and the National Natural Science Foundation of China (12271473 and U21A20426)

Keywords: learning theory; differential privacy; stochastic gradient descent; random features; reproducing kernel Hilbert spaces

Abstract: In the realm of large-scale machine learning, it is crucial to explore methods for reducing computational complexity and memory demands while maintaining generalization performance. Moreover, since the collected data may contain sensitive information, it is also of great significance to study privacy-preserving machine learning algorithms. This paper focuses on the performance of the differentially private stochastic gradient descent (SGD) algorithm based on random features. To begin, the algorithm maps the original data into a low-dimensional space, thereby avoiding the storage problems of traditional kernel methods on large-scale data. Then, the algorithm iteratively optimizes parameters via stochastic gradient descent. Finally, the output perturbation mechanism is employed to introduce random noise, ensuring algorithmic privacy. We prove that the proposed algorithm satisfies differential privacy while achieving fast convergence rates under some mild conditions.
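
The abstract outlines the algorithm in three steps: a random-feature map that avoids storing the kernel matrix, SGD over the mapped data, and output perturbation with random noise. The sketch below illustrates that pipeline under stated assumptions: random Fourier features for a Gaussian kernel, a least-squares loss, a single SGD pass, and a noise scale left as a free parameter (the paper calibrates the noise to the algorithm's sensitivity and the privacy budget). All function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def make_random_fourier_features(d, D, sigma, rng):
    """Return a map x -> z(x) in R^D approximating a Gaussian kernel."""
    W = rng.normal(scale=1.0 / sigma, size=(d, D))   # frequencies ~ N(0, sigma^-2 I)
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)        # random phases
    return lambda X: np.sqrt(2.0 / D) * np.cos(X @ W + b)

def dp_sgd_random_features(X, y, D=200, sigma=1.0, eta=0.1, noise_scale=0.1, seed=0):
    """One pass of SGD on random features, then Gaussian output perturbation."""
    rng = np.random.default_rng(seed)
    featurize = make_random_fourier_features(X.shape[1], D, sigma, rng)
    Z = featurize(X)   # (n, D): data mapped to a low-dimensional space, so no
                       # n-by-n kernel matrix is ever stored
    theta = np.zeros(D)
    for i in rng.permutation(len(y)):          # single pass over the samples
        grad = (Z[i] @ theta - y[i]) * Z[i]    # least-squares gradient at one point
        theta -= eta * grad
    # Output perturbation: release theta plus Gaussian noise. noise_scale is a
    # free parameter here; a faithful implementation would calibrate it to the
    # algorithm's sensitivity and the (epsilon, delta) privacy budget.
    return theta + rng.normal(scale=noise_scale, size=D), featurize

# Illustrative use on synthetic data:
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
theta_priv, featurize = dp_sgd_random_features(X, y)
y_hat = featurize(X) @ theta_priv              # predictions reuse the same feature map
```

Returning the feature map alongside the privatized parameters lets prediction reuse the same frequencies and phases drawn at training time, which keeps the low-dimensional kernel approximation consistent between training and inference.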
