Orthogonal nonnegative learning for sparse feature extraction and approximate combinatorial optimization
Author affiliation: Department of Information and Computer Science, Aalto University, FI-00076 Aalto, Espoo, Finland
Published in: Frontiers of Electrical and Electronic Engineering in China
Year/Volume/Issue: 2010, Vol. 5, No. 3
Pages: 261-273
Subject classification: 0808 [Engineering - Electrical Engineering]; 0809 [Engineering - Electronic Science and Technology]; 07 [Science]; 0805 [Engineering - Materials Science and Engineering]; 0701 [Science - Mathematics]; 0702 [Science - Physics]; 070101 [Science - Pure Mathematics]
Keywords: nonnegative factorization; sparse feature extraction; orthogonal learning; clustering
Abstract: Nonnegativity has been shown to be a powerful principle in linear matrix decompositions, leading to sparse component matrices in feature analysis and data clustering. The classical method is Lee and Seung's Nonnegative Matrix Factorization. A standard way to form learning rules is by multiplicative updates, which maintain nonnegativity. Here, a generic principle is presented for forming multiplicative update rules that integrate an orthonormality constraint into nonnegative learning. The principle, called Orthogonal Nonnegative Learning (ONL), is rigorously derived from the Lagrangian technique. As examples, the proposed method is applied for transforming Nonnegative Matrix Factorization (NMF) and its variant, Projective Nonnegative Matrix Factorization (PNMF), into their orthogonal versions. In general, it is well known that orthogonal nonnegative learning can give very useful approximative solutions for problems involving non-vectorial data, for example, binary vectors. Discrete optimization is replaced by continuous-space gradient optimization, which is often computationally easier. It is shown how the multiplicative update rules obtained by using the proposed ONL principle can find a nonnegative and highly orthogonal matrix for an approximated graph partitioning problem. The empirical results on various graphs indicate that our nonnegative learning algorithms not only outperform those without the orthogonality condition, but also surpass other existing partitioning approaches.
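To illustrate the baseline the abstract builds on, the following is a minimal NumPy sketch of Lee and Seung's classical multiplicative updates for NMF under the Frobenius-norm objective. This is only the unconstrained starting point, not the paper's ONL rules; the function name, rank choice, and small stabilizing constant `eps` are illustrative assumptions. Because every factor in each update is nonnegative, nonnegativity of the component matrices is maintained automatically, which is the property ONL extends with an orthonormality constraint.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for V ~= W @ H
    (Frobenius-norm objective). Each update multiplies the current
    factor elementwise by a ratio of nonnegative terms, so W and H
    stay nonnegative throughout."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r))
    H = rng.random((r, m))
    for _ in range(n_iter):
        # H <- H * (W^T V) / (W^T W H); eps avoids division by zero
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W <- W * (V H^T) / (W H H^T)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factorize a random nonnegative matrix at rank 5
V = np.random.default_rng(1).random((20, 30))
W, H = nmf_multiplicative(V, r=5)
```

In the paper's setting, the interesting case is when such updates are further constrained so that one factor is (approximately) orthogonal, which turns the continuous factorization into an approximate solver for discrete problems such as graph partitioning.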