My second session was Michael Jordan’s Neyman lecture, which was well-attended despite the early hour (8:30). As usual, Michael gave a very well-articulated and broad talk. While the topic was rather close to the talks he gave in Edinburgh last year, I still gained a new understanding of Bayesian non-parametrics, maybe because his Neyman talk was even more encompassing than the earlier ones. (It also made me wonder whether or not we should incorporate some of this approach in Bayesian Core, sorry, Bayesian Essentials with R; presumably not, because we are aiming at a lower level of complexity….) A provocative introductory sentence from Michael: “I do not like priors”, maybe a tribute to Neyman?!
My third session was the one I organised (with the blessing of ISBA) on Bayesian model assessment. While Feng Liang unfortunately could not make it to JSM, Andrew Gelman and Jean-Michel Marin shared the extra time, and Merlise Clyde gave a concluding talk that also ran longer than scheduled. I found it a fantastic session, with a whole range of thoughtful and provocative proposals. (I have absolutely no responsibility in the above besides inviting those speakers!) Andrew drafted a very novel picture of how Bayesian model comparison could (should?) be run, getting away from the standard paraphernalia of Bayes factors, Occam’s razor, and the like. I did not agree with the whole of his proposal, especially when he considered handling several models together with “common” parameters, but this was exciting nonetheless! Jean-Michel presented a spatial mixture model where the component indicators were distributed according to a Potts model and the number of components was unknown. The approximation to the posterior distribution of the number of components was based on Chib’s approximation. This is a complex model with an interesting solution, even though I am now waiting for the ABC comparison. Merlise concluded the session with a great summary of Bayesian model assessment, differentiating M-closed from M-open cases. This was very close to my own perspective on the topic, although Merlise brought in the interesting new (for me!) idea that many decision-theoretic evaluations of models would favour model averaging. One additional item linking the three talks was that they all involved simulated pseudo-data one way or another, from posterior predictive to ABC. The session was well-attended, to the point of running out of seats, especially considering it competed with many other Bayesian sessions like the Savage Award.
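(As an aside on the simulated pseudo-data theme, here is a minimal R sketch of a posterior predictive check on a toy normal model; the conjugate prior and the choice of the sample maximum as discrepancy statistic are illustrative assumptions of mine, not taken from any of the talks.)

# Toy posterior predictive check: N(mu, 1) model, conjugate N(0, 10^2) prior on mu
# (illustrative setup only, not from the session)
set.seed(42)
y <- rnorm(50, mean = 1, sd = 1)         # observed data
n <- length(y)

# With known sd = 1, the posterior on mu is normal:
post_var  <- 1 / (n + 1 / 100)
post_mean <- post_var * sum(y)

# Simulate pseudo-datasets from the posterior predictive
T_rep <- replicate(1000, {
  mu   <- rnorm(1, post_mean, sqrt(post_var))
  yrep <- rnorm(n, mu, 1)
  max(yrep)                              # discrepancy: sample maximum
})

# Posterior predictive p-value for this discrepancy
mean(T_rep >= max(y))

A p-value close to 0 or 1 would flag a mismatch between the fitted model and the observed data in the direction of the chosen discrepancy.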
After lunch, it was back to parallel computing, with the JCGS papers session. Radu Craiu gave a talk on his RAPTOR algorithm, somehow connected to his talk in Utah last winter. This was an interesting example of adaptive MCMC, maybe the only one I will attend at JSM. In a connected way, Timothy Hanson used a Pólya tree construct to build a better-fitted proposal in an independent Metropolis-Hastings algorithm. The examples were quite convincing, with nice movies of recovering the true target, my worry being the limitation of the method when the dimension of the parameter increases (as usual with independent proposals). The final talk of the session was about the link between GPUs and population-based MCMC, again connected to a talk by Chris Holmes I heard at Valencia 9 last year. The gains brought by using GPUs are once again staggering!
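(On that last worry, a minimal R sketch of an independent Metropolis-Hastings sampler with a fixed Gaussian proposal, a generic stand-in of my own rather than the Pólya tree proposal of the talk, makes the dimension effect easy to reproduce.)

# Generic independent Metropolis-Hastings with a fixed N(0, 2^2 I_d) proposal,
# targeting a standard d-dimensional normal (illustrative, not the talk's method)
ind_mh <- function(d, n_iter = 1e4) {
  log_target   <- function(x) sum(dnorm(x, 0, 1, log = TRUE))
  log_proposal <- function(x) sum(dnorm(x, 0, 2, log = TRUE))
  x   <- rnorm(d, 0, 2)
  acc <- 0
  for (t in 1:n_iter) {
    y <- rnorm(d, 0, 2)                  # proposal does not depend on x
    log_alpha <- log_target(y) - log_target(x) +
                 log_proposal(x) - log_proposal(y)
    if (log(runif(1)) < log_alpha) { x <- y; acc <- acc + 1 }
  }
  acc / n_iter                           # acceptance rate
}
sapply(c(1, 5, 20, 50), ind_mh)          # acceptance collapses as d grows

Since an independent proposal must match the whole target at once, any per-coordinate mismatch compounds across dimensions, which is exactly why the acceptance rate decays so quickly with d.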
And then the day at JSM ended with the IMS presidential address, delivered by David Cox, about his views on statistical analysis. It was a brilliant, deep, foundational, and terribly impressive talk. The huge room was packed and I ended up standing in the back, which in a sense was more appropriate for the occasion. In the talk, David Cox mentioned seven kinds of Bayesians, from subjectivists to quasi-frequentists, while he only saw two kinds of frequentists, long-term validation versus calibration… Again, a very impressive talk!