Institution: The Ohio State University
Full text PDF: http://rave.ohiolink.edu/etdc/view?acc_num=osu1243990513
Mixture distributions have received considerable attention due to their flexible form and convenience of use. Markov chain Monte Carlo (MCMC) methods enable us to generate samples from a target distribution that is difficult to sample from directly by simulating a Markov chain. However, practical difficulties arise when MCMC methods are used to fit mixture distributions with several isolated modes. Most MCMC sampling methods have difficulty transitioning between the isolated modal regions, and inferences based on the samples they generate can be unreliable. This motivated us to develop efficient algorithms for fitting Bayesian mixture models. Our approach hinges on the premise that a preliminary understanding of some essential features of the posterior distribution is needed to make sampling more efficient. In this thesis we introduce two algorithms that rely on an initial identification of possible isolated modes of the mixture distribution. The algorithms are applied to fit four different models: a Bayesian univariate normal mixture model; a Bayesian univariate outlier accommodation model; a Bayesian linear regression model; and a hierarchical Bayesian regression model for repeated measures data. Their performance is compared to that of other methods, including the Gibbs sampler and an MCMC tempering transition method, by examining the accuracy of inferences and the ease of transition between isolated modal regions of the posterior distributions for the Bayesian models. The results show that the proposed algorithms outperform the Gibbs sampler and the tempering transition method.
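The mode-trapping behavior described above can be illustrated with a minimal sketch (not from the thesis): a random-walk Metropolis sampler targeting an equal-weight mixture of two well-separated normals. The particular modes, step size, and seed are illustrative assumptions; the point is that with local proposals the chain remains stuck in the mode where it starts, so sample-based estimates of mixture-wide quantities are badly biased.

```python
import math
import random

# Illustrative target: equal-weight mixture of N(-10, 1) and N(10, 1),
# i.e. two modes separated by a region of negligible density.
def log_target(x):
    def log_norm(x, mu):
        return -0.5 * (x - mu) ** 2 - 0.5 * math.log(2.0 * math.pi)
    a, b = log_norm(x, -10.0), log_norm(x, 10.0)
    m = max(a, b)  # log-sum-exp for numerical stability
    return m + math.log(0.5 * math.exp(a - m) + 0.5 * math.exp(b - m))

def metropolis(n_iter, x0, step, seed=0):
    """Random-walk Metropolis with Gaussian proposals of scale `step`."""
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(prop) / target(x)).
        if math.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        chain.append(x)
    return chain

chain = metropolis(20000, x0=-10.0, step=1.0)
# Under the true target, half the mass lies near +10, so a well-mixing
# chain would give a fraction near 0.5; a trapped chain gives nearly 0.
frac_right = sum(1 for x in chain if x > 0) / len(chain)
print(f"fraction of samples near the +10 mode: {frac_right:.3f}")
```

Tempering methods and the mode-identification algorithms developed in the thesis are designed precisely to restore movement between such isolated modal regions.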