
Category Archives: Information
Much Ado About Nothing
I just met up with a new student of mine, and gave her some warmup questions to get familiar with some of the things we’ll be working on. This involved “differential entropy”, which is basically Shannon entropy but for continuous … Continue reading
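As a quick illustration of the idea (a sketch of mine, not code from the post): for a Gaussian with standard deviation sigma, the differential entropy has the known closed form 0.5*ln(2*pi*e*sigma^2), and we can check it against a Monte Carlo estimate of -E[ln p(x)].

```python
import numpy as np

# Differential entropy of a Gaussian N(0, sigma^2).
# Analytic result: h = 0.5 * ln(2*pi*e*sigma^2).
# We verify it with a Monte Carlo estimate of -E[ln p(X)].
rng = np.random.default_rng(0)
sigma = 2.0

def gaussian_logpdf(x, sigma):
    # log density of N(0, sigma^2)
    return -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)

analytic = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

samples = rng.normal(0.0, sigma, size=200_000)
monte_carlo = -np.mean(gaussian_logpdf(samples, sigma))

print(analytic, monte_carlo)  # the two estimates should agree closely
```

Unlike Shannon entropy, this quantity can be negative (shrink sigma below 1/sqrt(2*pi*e) and the analytic value goes below zero), which is one of the warm-up subtleties the post alludes to.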
Posted in Entropy, Information
2 Comments
News and updates
Just a quick post to let you all know about some recent goings-on in my research over the last couple of months. Former MSc student and current PhD student Oliver (Ollie) Stevenson was profiled on our department’s homepage. Ollie is … Continue reading
Posted in Computing, Inference, Information, Personal
Leave a comment
MaxEnt in Sivia
The book I most commonly recommend to physicists and astronomers who want an introduction to Bayesian Inference is “Data Analysis: A Bayesian Tutorial” by Sivia and Skilling. It’s a neat little book that presents things very clearly, perhaps with the exception … Continue reading
Posted in Inference, Information
2 Comments
Hard Integrals
The “evidence” or “marginal likelihood” integral, Z = ∫ p(θ) p(D|θ) dθ, is often considered hard to calculate. The most general method I know of is Nested Sampling, although sometimes other methods can outperform it if the shape of the likelihood function is kind to the method. However, … Continue reading
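A minimal sketch of the integral in question (mine, not the post’s method, which is Nested Sampling): on a toy conjugate problem the evidence has a closed form, so a simple Monte Carlo average of the likelihood over prior draws can be checked against it.

```python
import numpy as np

# Toy evidence calculation: Z = integral of p(theta) * p(D|theta) d(theta).
# Prior: theta ~ N(0, 1). Likelihood: one datum D, with D|theta ~ N(theta, 1).
# Then the marginal of D is N(0, 2), so Z is known exactly.
rng = np.random.default_rng(1)
D = 1.0

def likelihood(theta):
    return np.exp(-0.5 * (D - theta)**2) / np.sqrt(2 * np.pi)

theta = rng.normal(0.0, 1.0, size=500_000)  # draws from the prior
Z_mc = np.mean(likelihood(theta))           # simple Monte Carlo estimate

Z_exact = np.exp(-D**2 / 4) / np.sqrt(4 * np.pi)  # N(D; 0, 2)

print(Z_mc, Z_exact)
```

This naive prior-sampling estimator is exactly what fails in realistic problems: when the likelihood is concentrated in a tiny region of the prior, almost no draws land there, which is why methods like Nested Sampling exist.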
Entropic priors are okay (but you probably shouldn’t use them)
Right now I’m in Canberra for the MaxEnt2013 meeting. It’s been an interesting and highly idiosyncratic conference so far, although I’m a bit tired because the schedule is very packed. On Wednesday, John Skilling gave an interesting presentation criticising “entropic … Continue reading
The prior isn’t the only prior
One of my favorite pastimes is railing against certain word choices when they may imply things that aren’t true. An example of this is the usage of the word “prior” in Bayesian inference. If a quantity is unknown, then the prior describes how … Continue reading
Posted in Inference, Information
9 Comments
Stochastic
Here is a nice Ed Jaynes quote, from his article “Probability Theory as Logic”: “Then in studying probability theory, it was vaguely troubling to see reference to “gaussian random variables”, or “stochastic processes”, or “stationary time series”, or “disorder”, as if … Continue reading
Posted in Inference, Information
3 Comments
Data are Nuisance Parameters
In inference, the term “nuisance parameters” refers to some quantity you feel like you need to put in your model, but that you don’t actually care about. For example, you might be fitting some data with a straight line y = mx + b, … Continue reading
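To make the nuisance-parameter idea concrete (an illustrative sketch of mine, not the post’s code): fit a straight line y = m*x + b with known noise, caring only about the slope m, and marginalize the intercept b out of the posterior on a grid.

```python
import numpy as np

# Straight-line fit where the intercept b is a nuisance parameter.
# Flat priors over the grid ranges; noise sigma assumed known.
rng = np.random.default_rng(2)
sigma = 0.5
x = np.linspace(0, 10, 30)
y = 1.5 * x + 2.0 + rng.normal(0, sigma, size=x.size)  # true m=1.5, b=2.0

m_grid = np.linspace(0, 3, 301)
b_grid = np.linspace(-2, 6, 401)
M, B = np.meshgrid(m_grid, b_grid, indexing="ij")

# Log-likelihood on the (m, b) grid.
resid = y[None, None, :] - (M[..., None] * x + B[..., None])
loglike = -0.5 * np.sum(resid**2, axis=-1) / sigma**2

# Marginalize out the nuisance parameter b by summing over its axis.
post_m = np.exp(loglike - loglike.max()).sum(axis=1)
post_m /= post_m.sum() * (m_grid[1] - m_grid[0])  # normalize the density

m_mean = np.sum(m_grid * post_m) * (m_grid[1] - m_grid[0])
print(m_mean)  # posterior mean of the slope; should land near the true 1.5
```

The summation over the b axis is the discrete analogue of integrating b out of the joint posterior, leaving a marginal distribution for the one parameter we actually care about.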
Posted in Entropy, Inference, Information
3 Comments