Variational learning for finite Beta-Liouville mixture models
Authors: LAI Yu-ping, ZHOU Ya-jian, PING Yuan, GUO Yu-cui, YANG Yi-xian
Affiliations: Information Security Center, Beijing University of Posts and Telecommunications; Department of Computer Science and Technology, Xuchang University; School of Science, Beijing University of Posts and Telecommunications
Abstract: In this article, an improved variational inference (VI) framework for learning finite Beta-Liouville mixture models (BLM) is proposed for the classification and clustering of proportional data. Within the VI framework, non-linear approximation techniques are adopted to obtain tractable approximations of the variational objective functions, and analytical solutions are derived for the variational posterior distributions. Compared with the expectation-maximization (EM) algorithm commonly used for learning mixture models, the proposed approach avoids both underfitting and overfitting. Furthermore, the parameters and the complexity of the mixture model (model order) can be estimated simultaneously. Experiments on both synthetic and real-world data sets demonstrate the feasibility and advantages of the proposed method.
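The abstract's central mechanism — a factorized variational posterior in which a sparse prior on the mixing weights drives surplus components toward zero, so parameters and model order are estimated together — can be sketched on a toy problem. The code below is not the paper's Beta-Liouville formulation; it is a minimal 1-D Gaussian-mixture analogue with known component variance, using a Dirichlet prior on the weights and a Gaussian prior on the means. All names (`vb_mixture_1d`, the prior settings, the pruning threshold) are illustrative assumptions, not from the paper.

```python
import math
import numpy as np

def digamma(x):
    """Digamma via recurrence plus an asymptotic series (adequate for x > 0)."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0/12 - f * (1.0/120 - f / 252))

def vb_mixture_1d(x, K=8, sigma2=0.25, alpha0=1e-3, beta0=1e-3, iters=200):
    """Variational Bayes for a 1-D Gaussian mixture with known component
    variance. A sparse Dirichlet prior (small alpha0) starves unneeded
    components of weight, so model order is selected during learning."""
    n = len(x)
    m = np.linspace(x.min(), x.max(), K)   # posterior means of component means
    v = np.full(K, 1.0)                    # posterior variances of component means
    alpha = np.full(K, alpha0 + n / K)     # Dirichlet posterior over weights
    for _ in range(iters):
        # E-step: responsibilities under the factorized posterior,
        # using E[(x - mu_k)^2] = (x - m_k)^2 + v_k and E[log pi_k].
        elogpi = np.array([digamma(a) for a in alpha]) - digamma(alpha.sum())
        logr = elogpi - 0.5 * ((x[:, None] - m) ** 2 + v) / sigma2
        logr -= logr.max(axis=1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of the variational posteriors.
        Nk = r.sum(axis=0)
        alpha = alpha0 + Nk
        lam = beta0 + Nk / sigma2          # posterior precision of each mean
        m = (r * x[:, None]).sum(axis=0) / sigma2 / lam
        v = 1.0 / lam
    weights = alpha / alpha.sum()
    return weights, m

# Three well-separated clusters; start with K=8 candidate components.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-4, 0.5, 100),
                    rng.normal(0, 0.5, 100),
                    rng.normal(4, 0.5, 100)])
weights, means = vb_mixture_1d(x)
effective = int((weights > 0.01).sum())   # components surviving pruning
```

Unlike plain EM, which fits exactly the K it is given and can overfit, the variational posterior here leaves redundant components with negligible expected weight, which is the model-order-selection behavior the abstract claims for the Beta-Liouville case.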
Keywords: variational inference; model selection; factorized approximation; Beta-Liouville distribution; mixture modeling
This article is indexed in CNKI, VIP (Weipu), ScienceDirect, and other databases.