Statistical Inference for Models with Intractable Normalizing Constants

Author: Ick Hoon Jin
Advisor: Faming Liang
Date issued: 2011-06-27 (created August 2011)

Abstract

In this dissertation, we propose two new algorithms for statistical inference in models with intractable normalizing constants: the Monte Carlo Metropolis-Hastings (MCMH) algorithm and the Bayesian Stochastic Approximation Monte Carlo (BSAMC) algorithm. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm: at each iteration, it replaces the unknown normalizing constant ratio in the acceptance probability by a Monte Carlo estimate. Although this violates the detailed balance condition, we show that the algorithm still converges to the desired target distribution under mild conditions. The BSAMC algorithm works by simulating from a sequence of approximated distributions using the stochastic approximation Monte Carlo (SAMC) algorithm, and a strong law of large numbers is established for BSAMC estimators under mild conditions. A significant advantage of both algorithms over auxiliary-variable MCMC methods is that they do not require perfect samples, so they can be applied to many models for which perfect sampling is unavailable or prohibitively expensive. In addition, although BSAMC also involves approximating the normalizing constant, it is robust to the initial parameter guess because of SAMC's ability to explore the sample space. BSAMC also provides a general framework for approximate Bayesian inference in models with intractable likelihoods: sampling from a sequence of approximated distributions whose average converges to the target distribution.
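As a rough illustration of the MCMH idea described above, here is a minimal Python sketch on a toy one-parameter exponential-family model. The toy model, the inner sampler, and all tuning constants are illustrative assumptions, not code from the dissertation: the point is only how the intractable ratio Z(theta')/Z(theta) in the acceptance probability is replaced by an importance-sampling estimate built from auxiliary draws at the current parameter.

```python
import math
import random

random.seed(1)

S = list(range(11))                      # finite support of the toy model


def f(x, theta):
    """Unnormalized density f(x; theta) = exp(theta * x); Z(theta) is treated as unknown."""
    return math.exp(theta * x)


def sample_model(theta, n, burn=200):
    """Draw approximate samples from f(.; theta)/Z(theta) via an inner MH chain."""
    x, out = random.choice(S), []
    for t in range(burn + n):
        y = random.choice(S)             # symmetric proposal on the support
        if random.random() < min(1.0, f(y, theta) / f(x, theta)):
            x = y
        if t >= burn:
            out.append(x)
    return out


def mcmh(data, theta0, n_iter=2000, m=200, step=0.05):
    """Monte Carlo Metropolis-Hastings sketch (flat prior, symmetric proposal).

    The intractable ratio Z(theta')/Z(theta) is replaced by the importance-
    sampling estimate (1/m) * sum_y f(y; theta') / f(y; theta), with the
    auxiliary y drawn from the model at the current theta.
    """
    theta, chain = theta0, []
    for _ in range(n_iter):
        theta_p = theta + random.gauss(0.0, step)
        aux = sample_model(theta, m)
        r_hat = sum(f(y, theta_p) / f(y, theta) for y in aux) / m
        like = math.prod(f(x, theta_p) / f(x, theta) for x in data)
        alpha = like / (r_hat ** len(data))   # estimated Z-ratio enters once per observation
        if random.random() < min(1.0, alpha):
            theta = theta_p
        chain.append(theta)
    return chain


data = sample_model(0.5, 100)            # synthetic data at true theta = 0.5
chain = mcmh(data, theta0=0.0)
est = sum(chain[1000:]) / len(chain[1000:])   # posterior mean after burn-in
```

Because r_hat is only an estimate, the resulting chain does not satisfy detailed balance, which is exactly the situation the convergence result quoted in the abstract addresses.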
With these two algorithms in hand, we demonstrate how the stochastic approximation MCMC (SAMCMC) method can be applied to estimate the parameters of exponential random graph models (ERGMs), a canonical example of statistical models with intractable normalizing constants. We show that the resulting estimator is consistent, asymptotically normal, and asymptotically efficient. Compared with the MCMLE and SSA methods, a significant advantage of SAMCMC is that it overcomes the model degeneracy problem. Its strength comes from its varying-truncation mechanism, which allows SAMCMC to escape model degeneracy through re-initialization. MCMLE and SSA lack such a re-initialization mechanism and tend to converge to a solution near the starting point, so they often fail for models that suffer from degeneracy.

URI: http://hdl.handle.net/1969.1/150938
Subjects: Autologistic Model; Ising Model; Stochastic Approximation Monte Carlo; Exponential Random Graph Models; Markov Chain Monte Carlo; Intractable Normalizing Constants
Type: Thesis
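The varying-truncation re-initialization mechanism credited above for avoiding model degeneracy can be sketched in a few lines. This is a toy moment-matching version on a one-parameter exponential-family model, not the dissertation's actual ERGM implementation; the truncation bounds, gain sequence, and function names are illustrative assumptions. The key move is the last branch: whenever the stochastic-approximation iterate escapes the current truncation region, it is re-initialized at the starting value and the region is enlarged, rather than being allowed to drift into a degenerate part of the parameter space.

```python
import math
import random

random.seed(7)

S = list(range(11))                       # finite support of the toy model


def sample_model(theta, n=50, burn=200):
    """Inner MH chain drawing approximate samples from f(x; theta) ∝ exp(theta * x)."""
    x, out = random.choice(S), []
    for t in range(burn + n):
        y = random.choice(S)              # symmetric proposal on the support
        if random.random() < min(1.0, math.exp(theta * (y - x))):
            x = y
        if t >= burn:
            out.append(x)
    return out


def samcmc(s_obs, theta0=0.0, n_iter=500):
    """Stochastic approximation with varying truncation (sketch).

    theta is updated by a noisy moment-matching step; if the iterate leaves
    the current truncation region [-bound, bound], it is re-initialized at
    theta0 and the region is enlarged.
    """
    theta, sigma = theta0, 0              # sigma counts re-initializations
    bound = 2.0                           # current truncation region is [-bound, bound]
    for k in range(1, n_iter + 1):
        a_k = 1.0 / k                     # decreasing gain sequence
        draws = sample_model(theta)
        s_hat = sum(draws) / len(draws)   # simulated mean statistic at current theta
        theta_new = theta + a_k * (s_obs - s_hat)
        if abs(theta_new) > bound:        # varying truncation: re-initialize, grow region
            theta, sigma, bound = theta0, sigma + 1, bound * 1.5
        else:
            theta = theta_new
    return theta


# In this toy model the mean statistic at theta = 0.5 is about 8.5, so
# matching s_obs = 8.5 should drive the iterate toward theta near 0.5.
theta_hat = samcmc(s_obs=8.5)
```

MCMLE- or SSA-style updates would keep stepping from wherever the iterate currently sits; the re-initialization branch is what gives the method a fresh start when an update overshoots, which is the behavior the abstract singles out.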