Robust Text Detection in Natural Scenes Using Text Geometry and Visual Appearance

Authors: Sheng-Ye Yan, Xin-Xing Xu, Qing-Shan Liu

Author affiliations: School of Information and Control, Nanjing University of Information Science and Technology; School of Computer Engineering, Nanyang Technological University

Published in: International Journal of Automation and Computing

Year/Volume/Issue: 2014, Vol. 11, No. 5

Pages: 480-488

Subject classification: 081203 [Engineering - Computer Applications Technology]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0812 [Engineering - Computer Science and Technology]

Funding: Supported by the National Natural Science Foundation of China (Nos. 61300163, 61125106, and 61300162) and the Jiangsu Key Laboratory of Big Data Analysis Technology

Keywords: text detection; geometric rule; stroke width transform (SWT); support vector machine (SVM); multiple kernel learning (MKL)

Abstract: This paper proposes a new two-phase approach to robust text detection that integrates visual appearance with geometric reasoning rules. In the first phase, geometric rules are used to achieve a high recall rate. Specifically, a robust stroke width transform (RSWT) feature is proposed to better recover the stroke width by additionally considering the crossing of two strokes and the continuity of the letter border. In the second phase, a classification scheme based on visual appearance features is used to reject false alarms while preserving the recall rate. To learn a better classifier from multiple visual appearance features, a novel classification method called double soft multiple kernel learning (DS-MKL) is proposed. DS-MKL is motivated by a novel kernel margin perspective on multiple kernel learning and can effectively suppress the influence of noisy base kernels. Comprehensive experiments on the benchmark ICDAR 2005 competition dataset demonstrate the effectiveness of the proposed two-phase text detection approach over state-of-the-art approaches, with a performance gain of up to 4.4% in terms of F-measure.
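
The minimal Python sketch below illustrates only the general shape of the two-phase pipeline outlined in the abstract, under loose assumptions: the Phase-1 helper extract_candidate_regions is a hypothetical stand-in for the geometric/stroke-width filtering (whose details are not given here), and Phase 2 is shown as a plain multi-kernel SVM with fixed, hand-set kernel weights rather than the paper's DS-MKL, which learns those weights.

```python
# Sketch of the two-phase structure described in the abstract.
# Phase 1 (candidate generation via geometric rules / stroke width) is a
# hypothetical placeholder; Phase 2 uses a generic multi-kernel SVM with
# fixed kernel weights -- NOT the paper's DS-MKL.

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel


def extract_candidate_regions(image):
    """Hypothetical Phase-1 stand-in: return candidate text regions produced
    by stroke-width / geometric filtering (implementation not shown)."""
    raise NotImplementedError


def combined_kernel(feat_sets_a, feat_sets_b, weights, gammas):
    """Combine several base RBF kernels (one per appearance feature type)
    with fixed non-negative weights -- a simplification of learned MKL weights."""
    K = np.zeros((feat_sets_a[0].shape[0], feat_sets_b[0].shape[0]))
    for Xa, Xb, w, g in zip(feat_sets_a, feat_sets_b, weights, gammas):
        K += w * rbf_kernel(Xa, Xb, gamma=g)
    return K


def train_phase2_classifier(train_feat_sets, labels, weights, gammas):
    """Phase 2: train an SVM on the combined (precomputed) kernel to reject
    false-alarm candidates while keeping true text regions."""
    K_train = combined_kernel(train_feat_sets, train_feat_sets, weights, gammas)
    clf = SVC(kernel="precomputed", C=1.0)
    clf.fit(K_train, labels)
    return clf


def classify_candidates(clf, cand_feat_sets, train_feat_sets, weights, gammas):
    """Score Phase-1 candidates: 1 = text, 0 = false alarm."""
    K_test = combined_kernel(cand_feat_sets, train_feat_sets, weights, gammas)
    return clf.predict(K_test)
```

This is only a structural illustration; the paper's contribution lies in the RSWT feature for Phase 1 and in learning the kernel weights softly (DS-MKL) for Phase 2, neither of which is reproduced here.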
