Entropic priors are okay (but you probably shouldn’t use them)

Right now I’m in Canberra for the MaxEnt2013 meeting. It’s been an interesting and highly idiosyncratic conference so far, although I’m a bit tired because the schedule is very packed. On Wednesday, John Skilling gave an interesting presentation criticising “entropic …