
Overhead-free Noise-tolerant Federated Learning: A New Baseline

Authors: Shiyi Lin; Deming Zhai; Feilong Zhang; Junjun Jiang; Xianming Liu; Xiangyang Ji

Affiliations: Department of Computer Science and Technology, Harbin Institute of Technology, Harbin 150000, China; Department of Automation, Tsinghua University, Beijing 100084, China

Published in: Machine Intelligence Research

Year/Volume/Issue: 2024, Vol. 21, No. 3

Pages: 526-537


Subject classification: 12 [Management]; 1201 [Management Science and Engineering]; 081104 [Pattern Recognition and Intelligent Systems]; 08 [Engineering]; 0835 [Software Engineering]; 0811 [Control Science and Engineering]; 0812 [Computer Science and Technology]

Funding: Supported by the National Natural Science Foundation of China (Nos. 92270116 and 62071155)

Keywords: federated learning; noise-label learning; privacy-preserving machine learning; edge intelligence; distributed machine learning

Abstract: Federated learning (FL) is a promising decentralized machine learning approach that enables multiple distributed clients to train a model jointly while keeping their data private. However, in real-world scenarios, the supervised training data stored on local clients inevitably suffer from imperfect annotations, resulting in subjective, inconsistent and biased labels. These noisy labels can harm the collaborative aggregation process of FL by inducing inconsistent decision boundaries. Unfortunately, few attempts have been made towards noise-tolerant federated learning, and most of them rely on transmitting overhead messages to assist noisy-label detection and correction, which increases the communication burden as well as privacy risks. In this paper, we propose a simple yet effective method for noise-tolerant FL based on the well-established co-training framework. Our method leverages the inherent discrepancy in the learning ability of the local and global models in FL, which can be regarded as two complementary views. By iteratively exchanging samples with their high-confidence predictions, the two models “teach each other” to suppress the influence of noisy labels. The proposed scheme enjoys the benefit of being overhead-free and can serve as a robust and efficient baseline for noise-tolerant federated learning. Experimental results demonstrate that our method outperforms existing approaches, highlighting its superiority.
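The exchange step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each model trusts its peer's predicted label on samples where the peer's softmax confidence exceeds a threshold, and keeps the given (possibly noisy) annotation otherwise. The function name `co_teach_relabel` and the threshold `tau` are illustrative assumptions.

```python
import numpy as np

def co_teach_relabel(local_probs, global_probs, noisy_labels, tau=0.9):
    """Sketch of the co-training label exchange between the two views.

    local_probs, global_probs: (N, C) softmax outputs of the local and
    global models on the same batch. noisy_labels: (N,) given annotations.
    Wherever one model's top confidence reaches tau, its peer trains on
    that model's predicted class instead of the noisy annotation.
    """
    local_probs = np.asarray(local_probs)
    global_probs = np.asarray(global_probs)
    noisy_labels = np.asarray(noisy_labels)

    # Labels the local model trains on: trust the global model's
    # prediction where it is confident, else keep the given label.
    g_conf = global_probs.max(axis=1) >= tau
    labels_for_local = np.where(g_conf, global_probs.argmax(axis=1), noisy_labels)

    # Symmetrically, labels for the global model's update.
    l_conf = local_probs.max(axis=1) >= tau
    labels_for_global = np.where(l_conf, local_probs.argmax(axis=1), noisy_labels)

    return labels_for_local, labels_for_global
```

For example, with `local_probs = [[0.95, 0.05], [0.6, 0.4]]`, `global_probs = [[0.2, 0.8], [0.97, 0.03]]` and `noisy_labels = [1, 1]`, the global model confidently predicts class 0 on the second sample, so the local model is relabeled there; the local model confidently predicts class 0 on the first sample, so the global model is relabeled there. Iterating this exchange is what lets the two views suppress each other's noisy labels without transmitting any extra messages.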
