Dirichlet Process Gaussian Mixture Models: Choice of the Base Distribution
Authors:Dilan Görür  Carl Edward Rasmussen
Affiliation:1. Gatsby Computational Neuroscience Unit, University College London, London, U.K.; 2. Department of Engineering, University of Cambridge, Cambridge, U.K.; 3. Max Planck Institute for Biological Cybernetics, Tübingen, Germany
Abstract:In the Bayesian mixture modeling framework the number of components needed to model the data can itself be inferred, so there is no need to restrict it explicitly in advance. Nonparametric mixture models sidestep the problem of finding the “correct” number of mixture components by assuming infinitely many components. In this paper Dirichlet process mixture (DPM) models are cast as infinite mixture models, and inference using Markov chain Monte Carlo is described. The specification of the priors on the model parameters is often guided by mathematical and practical convenience. The primary goal of this paper is to compare conjugate and non-conjugate base distributions for a class of DPM models that is widely used in applications, the Dirichlet process Gaussian mixture model (DPGMM). We compare the computational efficiency and modeling performance of DPGMMs defined using a conjugate and a conditionally conjugate base distribution. We show that better density models can result from using a wider class of priors, with no or only a modest increase in computational effort.
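To illustrate the "infinitely many components" view the abstract describes, the generative side of a DP Gaussian mixture can be sketched via the Chinese restaurant process: each point joins an existing cluster in proportion to its size, or opens a new cluster with probability proportional to the concentration parameter alpha, with cluster means drawn from the base distribution. This is a minimal illustrative sketch, not the paper's inference procedure; all parameter names and default values here are assumptions.

```python
import random

def crp_gaussian_mixture(n, alpha=1.0, base_mean=0.0, base_std=3.0,
                         obs_std=0.5, seed=0):
    """Draw n points from a DP Gaussian mixture via the Chinese restaurant
    process. Cluster means come from the base distribution N(base_mean,
    base_std^2); observations are N(mean_k, obs_std^2). The concentration
    parameter alpha controls how readily new clusters appear.
    (Illustrative sketch; not the inference algorithm from the paper.)"""
    rng = random.Random(seed)
    means = []        # one mean per cluster discovered so far
    counts = []       # number of points assigned to each cluster
    assignments = []  # cluster index of each point
    data = []
    for i in range(n):
        # Point i joins cluster k with probability counts[k] / (i + alpha),
        # and opens a new cluster with probability alpha / (i + alpha).
        r = rng.random() * (i + alpha)
        acc = 0.0
        k = len(means)  # default: open a new cluster
        for j, c in enumerate(counts):
            acc += c
            if r < acc:
                k = j
                break
        if k == len(means):
            means.append(rng.gauss(base_mean, base_std))
            counts.append(0)
        counts[k] += 1
        assignments.append(k)
        data.append(rng.gauss(means[k], obs_std))
    return data, assignments

data, z = crp_gaussian_mixture(200, alpha=2.0)
print(len(set(z)), "clusters discovered among", len(data), "points")
```

Note that the number of occupied clusters is not fixed in advance: it grows (roughly logarithmically in n) as more data arrive, which is exactly why explicit model-order selection can be avoided. Changing the base distribution (here a simple Gaussian on the means) changes where new clusters tend to appear, the design choice the paper studies.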
This article is indexed by Wanfang Data, SpringerLink, and other databases.