
Large-scale self-normalizing neural networks

Authors: Zhaodong Chen, Weiqin Zhao, Lei Deng, Yufei Ding, Qinghao Wen, Guoqi Li, Yuan Xie

Affiliations: University of California, Santa Barbara, Santa Barbara, CA 93106, United States; University of Hong Kong, Hong Kong 999077, China; Tsinghua University, Beijing 100084, China; University of California, San Diego, San Diego, CA 92093, United States; University of Sydney, Sydney 2006, Australia; Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China; University of Chinese Academy of Sciences, Beijing 100190, China; The Hong Kong University of Science and Technology, Hong Kong 999077, China

Publication: Journal of Automation and Intelligence

Year/Volume/Issue: 2024, Vol. 3, No. 2

Pages: 101-110

Subject classification: 12 [Management]; 1201 [Management - Management Science and Engineering (degrees awardable in Management or Engineering)]; 081104 [Engineering - Pattern Recognition and Intelligent Systems]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0811 [Engineering - Control Science and Engineering]; 0812 [Engineering - Computer Science and Technology (degrees awardable in Engineering or Science)]

Funding: National Key R&D Program of China (2018AAA0102600); National Natural Science Foundation of China (Nos. 61876215, 62106119); Beijing Academy of Artificial Intelligence (BAAI), China; Chinese Institute for Brain Research, Beijing; Science and Technology Major Project of Guangzhou, China (202007030006)

Keywords: Self-normalizing neural network; Mean-field theory; Block dynamical isometry; Activation function

Abstract: Self-normalizing neural networks (SNN) regulate the activation and gradient flows through activation functions with the self-normalization property. Since SNNs do not rely on norms computed from minibatches, they are more friendly to data parallelism, kernel fusion, and emerging architectures such as ReRAM-based accelerators. However, existing SNNs have mainly demonstrated their effectiveness on toy datasets and fall short in accuracy when dealing with large-scale tasks like ImageNet. They lack the strong normalization, regularization, and expression power required for wider, deeper models and larger-scale tasks. To enhance the normalization strength, this paper introduces a comprehensive and practical definition of the self-normalization property in terms of the stability and attractiveness of the statistical fixed points. It is comprehensive as it jointly considers all the fixed points used by existing studies: the first and second moment of the forward activation and the expected Frobenius norm of the backward gradient. The practicality comes from the analytical equations provided by our paper to assess the stability and attractiveness of each fixed point, which are derived from theoretical analysis of the forward and backward propagation. The proposed definition is applied to a meta activation function inspired by prior research, leading to a stronger self-normalizing activation function named "bi-scaled exponential linear unit with backward standardized" (bSELU-BSTD). We provide both theoretical and empirical evidence to show that it is superior to existing approaches. To enhance the regularization and expression power, we further propose scaled-Mixup and channel-wise scale & shift. With these three techniques, our approach achieves 75.23% top-1 accuracy on ImageNet with Conv MobileNet V1, surpassing the performance of existing self-normalizing activation functions. To the best of our knowledge, this is the first SNN that achieves comparable accuracy to batch normalization on ImageNet.
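The self-normalization property described in the abstract, namely that per-layer activation statistics are attracted to a stable fixed point, can be illustrated numerically. The sketch below is not the paper's bSELU-BSTD (whose exact form is not given here); it uses the classic SELU activation with LeCun-normal weights, whose constants are public, and simply shows layer-wise mean and variance being pulled toward the (0, 1) fixed point. The width, depth, and batch size are arbitrary demonstration values.

# Minimal numerical sketch of the self-normalization fixed point, using SELU
# (not the paper's bSELU-BSTD) as the self-normalizing activation function.
import numpy as np

LAMBDA, ALPHA = 1.0507009873554805, 1.6732632423543772  # published SELU constants

def selu(x):
    # SELU: lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

rng = np.random.default_rng(0)
width, depth, batch = 512, 16, 2048

# Start from activations that are far from the (mean, variance) = (0, 1) fixed point.
h = rng.normal(loc=0.5, scale=2.0, size=(batch, width))
for layer in range(depth):
    # LeCun-normal initialization (variance 1/fan_in), as assumed by SELU theory.
    w = rng.normal(0.0, np.sqrt(1.0 / width), size=(width, width))
    h = selu(h @ w)
    print(f"layer {layer + 1:2d}: mean = {h.mean():+.3f}  var = {h.var():.3f}")

Under these assumptions the printed mean drifts toward 0 and the variance toward 1 within a few layers, which mirrors what the paper formalizes as the stability and attractiveness of the statistical fixed point.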
