
Applying Feature-Weighted Gradient Decent K-Nearest Neighbor to Select Promising Projects for Scientific Funding

Authors: Chuqing Zhang, Jiangyuan Yao, Guangwu Hu, Thomas Schott

Affiliations: School of Economics and Management, North China Electric Power University, Beijing 102206, China; School of Computer Science & Cyberspace Security, Hainan University, Haikou 570228, China; School of Computer Science, Shenzhen Institute of Information Technology, Shenzhen 518172, China; The Faculty of Business and Social Science, University of Southern Denmark, Kolding DK-6000, Denmark

Published in: Computers, Materials & Continua

Year/Volume/Issue: 2020, Volume 64, Issue 9

Pages: 1741-1753


Subject classification: 08 [Engineering], 0812 [Engineering: Computer Science and Technology (Engineering or Science degrees conferrable)]

Funding: J. Yao would like to thank the support of the Program of Hainan Association for Science and Technology Plans to Youth R&D Innovation [QCXM201910], the Scientific Research Setup Fund of Hainan University [KYQD(ZR)1837], and the National Natural Science Foundation of China. G. Hu would like to thank the support of the Fundamental Research Project of Shenzhen Municipality [JCYJ20170817115335418].

Keywords: FGDKNN, project selection, scientific funding, machine learning

Abstract: Due to its outstanding ability to process large quantities of high-dimensional data, machine learning has been applied in many areas, such as pattern recognition, classification, spam filtering, and data mining. As an outstanding machine learning algorithm, K-Nearest Neighbor (KNN) has been widely used in different situations, yet its application to selecting qualified applicants for funding has rarely been explored. The major problem lies in how to accurately determine the importance of attributes. In this paper, we propose a Feature-weighted Gradient Decent K-Nearest Neighbor (FGDKNN) method to classify funding applicants into two types: approved or not approved. FGDKNN is based on a gradient descent learning algorithm that updates feature weights. It updates the weights by iteratively minimizing the error ratio, so that the importance of each attribute can be described accurately. We investigate the performance of FGDKNN on Beijing funding data. The results show that FGDKNN performs about 23%, 20%, 18%, and 15% better than KNN, SVM, DT, and ANN, respectively. Moreover, FGDKNN converges quickly under different training scales and performs well under different settings.
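To illustrate the idea described in the abstract, the following is a minimal sketch of a feature-weighted KNN whose weights are learned by iteratively reducing a leave-one-out error rate. It is not the authors' FGDKNN implementation: the function names, the finite-difference approximation of the gradient, and the synthetic data are all assumptions made for this sketch.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, w, k=3):
    """Classify x by majority vote among the k nearest training points
    under a feature-weighted Euclidean distance."""
    d = np.sqrt((((X_train - x) ** 2) * w).sum(axis=1))
    nearest = np.argsort(d)[:k]
    return np.bincount(y_train[nearest]).argmax()

def loo_error(X, y, w, k=3):
    """Leave-one-out error rate of the weighted KNN on (X, y)."""
    errs = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        errs += weighted_knn_predict(X[mask], y[mask], X[i], w, k) != y[i]
    return errs / len(X)

def fit_weights(X, y, k=3, lr=0.5, eps=1e-2, iters=20):
    """Learn feature weights by descending the leave-one-out error,
    using a finite-difference estimate in place of an analytic gradient.
    Updates that would increase the error are rejected, so the error
    rate is non-increasing across iterations."""
    w = np.ones(X.shape[1])
    for _ in range(iters):
        base = loo_error(X, y, w, k)
        grad = np.zeros_like(w)
        for j in range(len(w)):
            wp = w.copy()
            wp[j] += eps
            grad[j] = (loo_error(X, y, wp, k) - base) / eps
        w_new = np.clip(w - lr * grad, 1e-6, None)
        if loo_error(X, y, w_new, k) <= base:
            w = w_new
    return w

# Hypothetical data: feature 0 separates the classes, feature 1 is noise.
rng = np.random.default_rng(0)
n = 40
X = np.c_[np.r_[rng.normal(0, 0.5, n), rng.normal(3, 0.5, n)],
          rng.normal(0, 3, 2 * n)]
y = np.r_[np.zeros(n, int), np.ones(n, int)]
w = fit_weights(X, y)
```

Because a weight update is only accepted when it does not raise the leave-one-out error, the learned weights are guaranteed to do at least as well as uniform weights on the training data, which mirrors the paper's goal of letting the error ratio drive the importance assigned to each attribute.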
