误差准则的信息论解释
Citation: 陈霸东, 胡金春, 朱煜, 孙增圻. 误差准则的信息论解释 [J]. 自动化学报, 2009, 35(10): 1302-1309.
Authors: 陈霸东, 胡金春, 朱煜, 孙增圻
Received: 2008-04-07
Revised: 2008-12-20

Information Theoretic Interpretation of Error Criteria
CHEN Ba-Dong, HU Jin-Chun, ZHU Yu, SUN Zeng-Qi. Information Theoretic Interpretation of Error Criteria [J]. Acta Automatica Sinica, 2009, 35(10): 1302-1309.
Authors: CHEN Ba-Dong, HU Jin-Chun, ZHU Yu, SUN Zeng-Qi
Affiliation: 1. Institute of Manufacturing Engineering, Department of Precision Instruments and Mechanology, Tsinghua University, Beijing 100084, P. R. China; 2. State Key Laboratory of Intelligent Technology and Systems, Department of Computer Science and Technology, Tsinghua University, Beijing 100084, P. R. China
Abstract: Error criteria (or error cost functions) play significant roles in statistical estimation problems. In this paper, we study error criteria from the viewpoint of information theory. The relationships between error criteria and the error's entropy criterion are investigated. It is shown that an error criterion is equivalent to the error's entropy criterion plus a Kullback-Leibler information divergence (KL-divergence). Based on this result, two important properties of error criteria are proved. In particular, the optimum error criterion can be interpreted via the meanings of entropy and KL-divergence. Furthermore, a novel approach is proposed for choosing p-power error criteria, in which a KL-divergence-based cost is minimized. The proposed method is verified by Monte Carlo simulation experiments.
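The central identity can be illustrated compactly. As a minimal sketch (assuming, for illustration only, that the error cost can be written as \(\varphi(e) = -\log q(e)\) for some probability density \(q\)), with \(p_e\) denoting the probability density of the error \(e\):

\[
\mathbb{E}[\varphi(e)] \;=\; -\int p_e(e)\,\log q(e)\,\mathrm{d}e \;=\; H(p_e) \;+\; D_{\mathrm{KL}}\!\left(p_e \,\|\, q\right),
\]

where \(H(p_e) = -\int p_e(e)\log p_e(e)\,\mathrm{d}e\) is the error's differential entropy. Minimizing such a criterion therefore both reduces the error's entropy and pulls the error density toward the density \(q\) implied by the cost function, which is the sense in which an error criterion equals the entropy criterion plus a KL-divergence.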
Keywords: Estimation, error criteria, entropy, Kullback-Leibler information divergence (KL-divergence), adaptive filtering
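The abstract's final point, selecting the exponent of a p-power criterion |e|^p by minimizing a KL-divergence-based cost, can be sketched with a small Monte Carlo experiment. The sketch below is hypothetical and not the paper's exact procedure: it assumes the p-power criterion is matched to a zero-mean generalized Gaussian density of shape p, so that minimizing the KL divergence from the (fixed) error density to that family amounts to maximizing the average log-likelihood over p. The error samples, the candidate grid, and the helper gg_avg_loglik are illustrative assumptions.

import numpy as np
from scipy.special import gammaln

def gg_avg_loglik(e, p):
    # Average log-likelihood of samples e under a zero-mean generalized
    # Gaussian with shape p, its scale set to the maximum-likelihood value
    # alpha = (p * mean(|e|^p))^(1/p).
    abs_e = np.abs(e)
    alpha = (p * np.mean(abs_e ** p)) ** (1.0 / p)
    return (np.log(p) - np.log(2.0 * alpha) - gammaln(1.0 / p)
            - np.mean((abs_e / alpha) ** p))

# Hypothetical error samples (a Gaussian/Laplacian mixture, for illustration only).
rng = np.random.default_rng(0)
e = np.concatenate([rng.normal(0.0, 1.0, 5000), rng.laplace(0.0, 0.5, 5000)])

# The error's entropy does not depend on p, so minimizing the KL divergence
# between the error density and the matched generalized Gaussian is the same
# as maximizing the average log-likelihood over the candidate exponents.
candidates = np.linspace(0.5, 4.0, 36)
best_p = max(candidates, key=lambda p: gg_avg_loglik(e, p))
print("selected p-power exponent:", round(float(best_p), 2))

Heavier-tailed error samples push the selected exponent below 2, while near-Gaussian errors keep it close to 2, which is the usual intuition for choosing p-power criteria.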