Image Tagging by Semantic Neighbor Learning Using User-Contributed Social Image Datasets
Author affiliations: School of Computer and Information Technology, Northeast Petroleum University, Daqing 163318, China. Xukun Shen is with the State Key Laboratory of Virtual Reality Technology and Systems, Beihang University, Beijing 100191, China.
Published in: Tsinghua Science and Technology (Journal of Tsinghua University (Science and Technology), English Edition)
Year/Volume/Issue: 2017, Vol. 22, No. 6
Pages: 551-563
Subject classification: 08 [Engineering]; 080203 [Engineering: Mechanical Design and Theory]; 0802 [Engineering: Mechanical Engineering]
Funding: Supported in part by the National Natural Science Foundation of China (Nos. 61502094 and 61402099) and the Natural Science Foundation of Heilongjiang Province of China (Nos. F2016002 and F2015020)
Keywords: image tag; social image tagging; user-contributed datasets; semantic neighbor learning
Abstract: The explosive growth in the number of images on the Internet has brought with it the great challenge of how to effectively index, retrieve, and organize these resources. Assigning proper tags to visual content is key to the success of many applications such as image retrieval and content mining. Although recent years have witnessed many advances in image tagging, most existing methods rely on high-quality, large-scale training data that are expensive to obtain. In this paper, we propose a novel semantic neighbor learning method based on user-contributed social image datasets, which can be acquired from the Web's inexhaustible supply of social image content. In contrast to existing image tagging approaches that depend on high-quality image-tag supervision, our neighbor learning method acquires weak supervision through progressive neighborhood retrieval from noisy and diverse user-contributed image collections. The retrieved neighbor images are not only visually alike and partially correlated but also semantically related. We offer a step-by-step, easy-to-use implementation of the proposed method. Extensive experiments on several datasets demonstrate that the proposed method significantly outperforms competing approaches.
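The core idea of the abstract, propagating tags to an image from its visually nearest neighbors in a tagged collection, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's progressive neighborhood retrieval: neighbors are ranked by plain cosine similarity, and tag votes are similarity-weighted. The function name `tag_by_neighbors` and all parameters are illustrative assumptions.

```python
import numpy as np

def tag_by_neighbors(query_feat, feats, tag_lists, k=3, top_n=2):
    """Assign tags to a query image by voting among its k nearest neighbors.

    Illustrative sketch only; the paper's method refines the neighborhood
    progressively, while here a single cosine-similarity ranking is used.
    """
    feats = np.asarray(feats, dtype=float)
    q = np.asarray(query_feat, dtype=float)
    # Cosine similarity between the query and every candidate image.
    sims = feats @ q / (np.linalg.norm(feats, axis=1) * np.linalg.norm(q) + 1e-12)
    # Indices of the k most similar images, best first.
    neighbors = np.argsort(sims)[::-1][:k]
    # Similarity-weighted vote for every tag carried by a neighbor.
    votes = {}
    for i in neighbors:
        for tag in tag_lists[i]:
            votes[tag] = votes.get(tag, 0.0) + sims[i]
    return sorted(votes, key=votes.get, reverse=True)[:top_n]

# Toy usage: two neighbors share "sky", one also carries "cloud".
tags = tag_by_neighbors(
    query_feat=[1.0, 0.0],
    feats=[[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]],
    tag_lists=[["sky"], ["sky", "cloud"], ["car"]],
    k=2, top_n=2,
)
```

With real social image data, the feature vectors would come from a visual descriptor and the tag lists from user-contributed annotations, which is why the weighting and neighborhood size matter for filtering noise.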