Category Archives: Information

News and updates

Just a quick post to let you all know about some recent goings-on in my research over the last couple of months. Former MSc student and current PhD student Oliver (Ollie) Stevenson was profiled on our department’s homepage. Ollie is … Continue reading

Posted in Computing, Inference, Information, Personal | Leave a comment

MaxEnt in Sivia

The book I most commonly recommend to physicists and astronomers who want an introduction to Bayesian Inference is “Data Analysis: A Bayesian Tutorial” by Sivia and Skilling. It’s a neat little book that presents things very clearly, perhaps with the exception … Continue reading

Posted in Inference, Information | 2 Comments

Hard Integrals

The “evidence” or “marginal likelihood” integral, $Z = \int \pi(\theta) L(\theta) \, d\theta$, is often considered hard to calculate. The most general method I know of is Nested Sampling, although sometimes other methods can outperform it if the shape of the likelihood function is kind to them. However, … Continue reading
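As a toy illustration of why this integral is awkward (my own sketch, not code from the post — the numbers and names are made up): for a one-dimensional Gaussian likelihood with a Gaussian prior, $Z$ has a closed form, so a naive Monte Carlo estimate drawn from the prior can be checked against it.

```python
# Toy check: simple Monte Carlo estimate of the evidence Z = ∫ π(θ) L(θ) dθ
# for a 1-D Gaussian likelihood and Gaussian prior, where Z is known exactly.
import numpy as np

rng = np.random.default_rng(0)

# Prior: θ ~ N(0, 10^2). Likelihood: one datum 3.0 observed with noise σ = 1.
prior_mu, prior_sigma = 0.0, 10.0
data, noise_sigma = 3.0, 1.0

def likelihood(theta):
    return np.exp(-0.5 * ((data - theta) / noise_sigma) ** 2) / \
           (noise_sigma * np.sqrt(2 * np.pi))

# Monte Carlo: Z ≈ average of L(θ) over draws from the prior
theta = rng.normal(prior_mu, prior_sigma, size=1_000_000)
Z_mc = likelihood(theta).mean()

# Exact answer: convolution of the two Gaussians
Z_exact = np.exp(-0.5 * (data - prior_mu) ** 2 /
                 (prior_sigma ** 2 + noise_sigma ** 2)) / \
          np.sqrt(2 * np.pi * (prior_sigma ** 2 + noise_sigma ** 2))

print(Z_mc, Z_exact)  # should agree to a few significant figures
```

With the prior this much wider than the likelihood, most prior draws contribute almost nothing to the average, which is exactly the regime where methods like Nested Sampling are useful.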

Posted in Entropy, Inference, Information | 1 Comment

Entropic priors are okay (but you probably shouldn’t use them)

Right now I’m in Canberra for the MaxEnt2013 meeting. It’s been an interesting and highly idiosyncratic conference so far, although I’m a bit tired because the schedule is very packed. On Wednesday, John Skilling gave an interesting presentation criticising “entropic … Continue reading

Posted in Entropy, Inference, Information | 3 Comments

The prior isn’t the only prior

One of my favorite pastimes is railing against certain word choices when they may imply things that aren’t true. An example of this is the usage of the word “prior” in Bayesian inference. If a quantity is unknown, then the prior describes how … Continue reading

Posted in Inference, Information | 9 Comments

Stochastic

Here is a nice Ed Jaynes quote, from his article “Probability Theory as Logic”: “Then in studying probability theory, it was vaguely troubling to see reference to ‘gaussian random variables’, or ‘stochastic processes’, or ‘stationary time series’, or ‘disorder’, as if … Continue reading

Posted in Inference, Information | 2 Comments

Data are Nuisance Parameters

In inference, the term “nuisance parameters” refers to some quantity you feel like you need to put in your model, but that you don’t actually care about. For example, you might be fitting some data with a straight line $y = mx + b$, … Continue reading
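Here is a hedged sketch of that straight-line setup (my own illustration, not code from the post): the noise scale σ is treated as a nuisance parameter and averaged over on a grid, leaving only the slope and intercept we actually care about.

```python
# Illustrative sketch: fit y = m*x + b where the noise scale sigma is a
# nuisance parameter, marginalised out on a grid to leave p(m, b | data).
import numpy as np

rng = np.random.default_rng(1)

# Simulated data from a known line (slope 2, intercept 1, sigma 0.5)
x = np.linspace(0, 1, 20)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=x.size)

# Parameter grids (flat priors over these ranges, purely for illustration)
m_grid = np.linspace(0, 4, 80)
b_grid = np.linspace(-1, 3, 80)
sigma_grid = np.linspace(0.1, 2.0, 60)

M, B, S = np.meshgrid(m_grid, b_grid, sigma_grid, indexing="ij")
resid = y[None, None, None, :] - (M[..., None] * x + B[..., None])
loglike = -0.5 * np.sum((resid / S[..., None]) ** 2, axis=-1) \
          - y.size * np.log(S)

# Marginalise the nuisance parameter sigma (sum over its grid axis)
post_mb = np.exp(loglike - loglike.max()).sum(axis=2)
post_mb /= post_mb.sum()

i, j = np.unravel_index(post_mb.argmax(), post_mb.shape)
print("posterior mode: m ≈", m_grid[i], ", b ≈", b_grid[j])
```

The surviving distribution is over (m, b) only; σ has been averaged over rather than fixed, which is the standard way of handling a nuisance parameter.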

Posted in Entropy, Inference, Information | 3 Comments