
Multi-label active learning by model guided distribution matching

Authors: Nengneng GAO, Sheng-Jun HUANG, Songcan CHEN

Affiliation: College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics; Collaborative Innovation Center of Novel Software Technology and Industrialization, Nanjing 211106, China

Published in: Frontiers of Computer Science

Year/Volume/Issue: 2016, Vol. 10, No. 5

Pages: 845-855


Subject classification: 0810 [Engineering - Information and Communication Engineering]; 12 [Management]; 1201 [Management - Management Science and Engineering]; 081901 [Engineering - Mining Engineering]; 0808 [Engineering - Electrical Engineering]; 0819 [Engineering - Mining]; 081104 [Engineering - Pattern Recognition and Intelligent Systems]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0701 [Science - Mathematics]; 0811 [Engineering - Control Science and Engineering]; 0812 [Engineering - Computer Science and Technology]

Funding: National Natural Science Foundation of China; the Jiangsu Science Foundation; the CCF-Tencent Open Research Fund

Keywords: multi-label learning; batch mode active learning; distribution matching

Abstract: Multi-label learning is an effective framework for learning with objects that have multiple semantic labels, and has been successfully applied to many real-world tasks. In contrast with traditional single-label learning, the cost of labeling a multi-label example is rather high, so it becomes an important task to train an effective multi-label learning model with as few labeled examples as possible. Active learning, which actively selects the most valuable data to query their labels, is an important approach to reducing labeling cost. In this paper, we propose a novel approach, MADM, for batch mode multi-label active learning. On one hand, MADM exploits representativeness and diversity in both the feature and label space by matching the distribution between labeled and unlabeled data. On the other hand, it tends to query predicted positive instances, which are expected to be more informative than negative ones. Experiments on benchmark datasets demonstrate that the proposed approach can reduce the labeling cost significantly.
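The abstract's core idea, selecting a batch whose addition makes the labeled set best match the unlabeled distribution, while biasing toward predicted positives, can be sketched with a generic Maximum Mean Discrepancy (MMD) criterion. This is a minimal illustration, not the paper's actual MADM algorithm: the kernel choice, the `alpha` trade-off weight, and the greedy selection loop are all assumptions for the sake of a runnable example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix between the rows of X and Y.
    sq_dist = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dist)

def mmd2(X, Y, gamma=1.0):
    # Squared Maximum Mean Discrepancy: small when X and Y
    # look like samples from the same distribution.
    return (rbf_kernel(X, X, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean())

def select_batch(X_labeled, X_pool, pos_scores, batch_size, alpha=0.5, gamma=1.0):
    """Greedily pick pool instances that (a) shrink the MMD between
    labeled+selected data and the whole unlabeled pool (distribution
    matching) and (b) have high predicted-positive scores (informativeness).
    Both the greedy loop and alpha are illustrative assumptions."""
    selected = []
    for _ in range(batch_size):
        best_i, best_obj = None, np.inf
        for i in range(len(X_pool)):
            if i in selected:
                continue
            candidate = np.vstack([X_labeled]
                                  + [X_pool[j:j + 1] for j in selected]
                                  + [X_pool[i:i + 1]])
            obj = mmd2(candidate, X_pool, gamma) - alpha * pos_scores[i]
            if obj < best_obj:
                best_i, best_obj = i, obj
        selected.append(best_i)
    return selected
```

For example, with a small labeled set and a pool of 20 instances, `select_batch(X_labeled, X_pool, scores, batch_size=3)` returns three distinct pool indices to query next; the queried examples are then labeled and moved into the labeled set before the next round.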
