Information Divergence and the Generalized Normal Distribution: A Study on Symmetricity

Authors: Thomas L. Toulias, Christos P. Kitsos

Affiliations: Department of Electrical and Electronics Engineering, University of West Attica, Campus 1, Egaleo, 12243 Athens, Greece; Department of Informatics and Computer Engineering, University of West Attica, Campus 1, Egaleo, 12243 Athens, Greece

Publication: Communications in Mathematics and Statistics

Year/Volume/Issue: 2021, Vol. 9, No. 4

Pages: 439–465

Subject classification: 07 [Science]; 0714 [Science: Statistics (degrees awardable in science or economics)]; 0701 [Science: Mathematics]; 0812 [Engineering: Computer Science and Technology (degrees awardable in engineering or science)]; 070101 [Science: Fundamental Mathematics]

Keywords: Kullback–Leibler divergence; Jeffreys distance; Resistor-average distance; Multivariate γ-order normal distribution; Multivariate Student's t-distribution; Multivariate Laplace distribution

Abstract: This paper investigates and discusses the use of information divergence, through the widely used Kullback–Leibler (KL) divergence, under the multivariate (generalized) γ-order normal distribution (γ-GND). The behavior of the KL divergence, as far as its symmetricity is concerned, is studied by calculating the divergence of the γ-GND over the Student's multivariate t-distribution and vice versa. Certain special cases are also given and discussed. Furthermore, three symmetrized forms of the KL divergence, i.e., the Jeffreys distance, the geometric-KL as well as the harmonic-KL distances, are computed between two members of the γ-GND family, while the corresponding differences between those information distances are also discussed.
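
For reference, writing D(p||q) for the KL divergence, the three symmetrized forms named in the abstract are commonly taken as the Jeffreys distance J = D(p||q) + D(q||p), the geometric-KL distance G = sqrt(D(p||q) · D(q||p)), and the harmonic-KL distance H with the resistor-average convention 1/H = 1/D(p||q) + 1/D(q||p). The minimal Python sketch below evaluates them numerically for two univariate γ-GND members; the density parameterization and the helper names (gnd_pdf, kl) are illustrative assumptions, not the paper's closed-form multivariate expressions.

    import numpy as np

    def gnd_pdf(x, mu=0.0, sigma=1.0, g=2.0):
        # Univariate gamma-order normal kernel exp(-((g-1)/g)|z|^(g/(g-1))),
        # normalized numerically on the grid; g = 2 recovers the standard
        # normal. (Assumed parameterization, for illustration only.)
        z = np.abs((x - mu) / sigma)
        f = np.exp(-((g - 1.0) / g) * z ** (g / (g - 1.0)))
        return f / np.trapz(f, x)

    def kl(p, q, x):
        # D(p||q) by trapezoidal quadrature; eps guards against log(0)
        # in the far tails.
        eps = 1e-300
        return np.trapz(p * np.log((p + eps) / (q + eps)), x)

    x = np.linspace(-30.0, 30.0, 200001)
    p = gnd_pdf(x, sigma=1.0, g=2.0)       # Gaussian member (g = 2)
    q = gnd_pdf(x, sigma=1.5, g=3.0)       # heavier-tailed member

    d_pq, d_qp = kl(p, q, x), kl(q, p, x)
    jeffreys = d_pq + d_qp                 # Jeffreys distance
    geo_kl = np.sqrt(d_pq * d_qp)          # geometric-KL distance
    harm_kl = d_pq * d_qp / (d_pq + d_qp)  # harmonic-KL (resistor-average)

    print(d_pq, d_qp, jeffreys, geo_kl, harm_kl)

Since D(p||q) and D(q||p) differ in general, comparing the two directed values against these three averages illustrates numerically the asymmetry question that the paper treats analytically for the γ-GND family.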
