Category Archives: Information
MaxEnt in Sivia
The book I most commonly recommend to physicists and astronomers who want an introduction to Bayesian Inference is “Data Analysis: A Bayesian Tutorial” by Sivia and Skilling. It’s a neat little book that presents things very clearly, perhaps with the exception … Continue reading
Posted in Inference, Information
2 Comments
Hard Integrals
The “evidence” or “marginal likelihood” integral, $Z = \int p(\theta)\,p(D|\theta)\,d\theta$, is often considered hard to calculate. The most general method I know of is Nested Sampling, although sometimes other methods can outperform it if the shape of the likelihood function is kind to the method. However, … Continue reading
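As a minimal illustration of the integral named above (not the post's method, and no substitute for Nested Sampling), here is a simple Monte Carlo estimate of the evidence for an assumed toy model: one parameter $\theta$ with a standard normal prior, and a single Gaussian datum. The model, datum, and noise level are all invented for the sketch; averaging the likelihood over prior draws gives an unbiased (if often inefficient) estimate of $Z$.

```python
import numpy as np

# Assumed toy model: theta ~ N(0, 1) prior, one datum x_obs observed
# with Gaussian noise of known standard deviation sigma.
rng = np.random.default_rng(0)
x_obs, sigma = 1.5, 0.5

def likelihood(theta):
    """p(x_obs | theta) for a Gaussian measurement."""
    return np.exp(-0.5 * ((x_obs - theta) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Simple Monte Carlo: draw theta from the prior, average the likelihood.
theta = rng.normal(0.0, 1.0, size=200_000)
Z = likelihood(theta).mean()
```

For this conjugate toy case the exact answer is a Gaussian density, $Z = N(x_{\rm obs};\, 0,\, \sqrt{1 + \sigma^2})$, so the estimate can be checked directly; for peaked likelihoods in higher dimensions, simple Monte Carlo like this degrades quickly, which is part of why Nested Sampling exists.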
Entropic priors are okay (but you probably shouldn’t use them)
Right now I’m in Canberra for the MaxEnt2013 meeting. It’s been an interesting and highly idiosyncratic conference so far, although I’m a bit tired because the schedule is very packed. On Wednesday, John Skilling gave an interesting presentation criticising “entropic … Continue reading
The prior isn’t the only prior
One of my favorite pastimes is railing against certain word choices when they may imply things that aren’t true. An example of this is the usage of the word “prior” in Bayesian inference. If a quantity is unknown, then the prior describes how … Continue reading
Posted in Inference, Information
9 Comments
Stochastic
Here is a nice Ed Jaynes quote, from his article “Probability Theory as Logic”: “Then in studying probability theory, it was vaguely troubling to see reference to “gaussian random variables”, or “stochastic processes”, or “stationary time series”, or “disorder”, as if … Continue reading
Posted in Inference, Information
2 Comments
Data are Nuisance Parameters
In inference, the term “nuisance parameters” refers to some quantity you feel like you need to put in your model, but that you don’t actually care about. For example, you might be fitting some data with a straight line $y = mx + b$, … Continue reading
Posted in Entropy, Inference, Information
3 Comments
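To illustrate the nuisance-parameter idea from the excerpt above with a sketch (the data, priors, and grid are all assumptions for this example, not the post's own): fit a straight line $y = mx + b$ where only the slope $m$ is of interest, and marginalise the intercept $b$ out of the posterior by summing over its grid axis.

```python
import numpy as np

# Assumed toy data from y = 2x + 1 with Gaussian noise of known sigma.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 20)
sigma = 0.1
y = 2.0 * x + 1.0 + rng.normal(0.0, sigma, size=x.size)

def log_like(m, b):
    """Gaussian log likelihood for the straight-line model."""
    r = y - (m * x + b)
    return -0.5 * np.sum((r / sigma) ** 2)

# Grid over both parameters with flat priors on the grid.
m_grid = np.linspace(1.0, 3.0, 201)
b_grid = np.linspace(0.0, 2.0, 201)
ll = np.array([[log_like(m, b) for b in b_grid] for m in m_grid])
post = np.exp(ll - ll.max())
post /= post.sum()

# Marginalise the nuisance parameter b: sum the joint posterior over its axis.
post_m = post.sum(axis=1)
m_est = m_grid[np.argmax(post_m)]
```

The marginal `post_m` carries the uncertainty in $b$ into the inference for $m$, which is the point of treating $b$ as a nuisance parameter rather than fixing it at a best-fit value.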