Category Archives: Entropy

Gains from trade versus(?) subjective wellbeing

This year I’ve been learning basic economics. It’s a cool subject. One interesting concept is “gains from trade”. The idea is that a person probably only participates in a trade if they think they’d benefit from it. If two parties … Continue reading

Posted in Economics, Entropy, Inference | 3 Comments

Hard Integrals

The “evidence” or “marginal likelihood” integral, $Z = \int L(\theta)\,\pi(\theta)\,d\theta$, is often considered hard to calculate. The most general method I know of is Nested Sampling, although sometimes other methods can outperform it if the shape of the likelihood function is kind to the method. However, … Continue reading

Posted in Entropy, Inference, Information | 1 Comment
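To make the difficulty concrete, here is a minimal sketch (not the post's Nested Sampling, just naive "simple Monte Carlo") of estimating the evidence $Z$ for a toy conjugate model where $Z$ is also known analytically. The model, data value, and sample sizes are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (hypothetical): prior theta ~ N(0, 1), and a single datum
# x ~ N(theta, 1). The evidence Z = integral of L(theta) * pi(theta) d(theta)
# is analytic here: marginally x ~ N(0, 2), so we can check the estimate.
x = 1.5

def likelihood(theta):
    """Gaussian likelihood L(theta) = N(x | theta, 1)."""
    return np.exp(-0.5 * (x - theta) ** 2) / np.sqrt(2 * np.pi)

# Naive simple Monte Carlo: average the likelihood over draws from the prior.
theta = rng.standard_normal(1_000_000)
Z_mc = likelihood(theta).mean()

# Analytic evidence for comparison: N(x | 0, 2) density at x.
Z_true = np.exp(-0.5 * x**2 / 2) / np.sqrt(2 * np.pi * 2)

print(Z_mc, Z_true)
```

This works only because the toy likelihood is broad relative to the prior; when the likelihood occupies a tiny fraction of the prior volume, almost no prior samples land where $L$ is appreciable and the estimator's variance explodes, which is exactly the regime where methods like Nested Sampling earn their keep.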

Entropic priors are okay (but you probably shouldn’t use them)

Right now I’m in Canberra for the MaxEnt2013 meeting. It’s been an interesting and highly idiosyncratic conference so far, although I’m a bit tired because the schedule is very packed. On Wednesday, John Skilling gave an interesting presentation criticising “entropic … Continue reading

Posted in Entropy, Inference, Information | 2 Comments

Entropy in Vocal Technique

One of my favorite hobbies is singing. I never had any talent for it, but through a lot of persistence and study I’ve been able to progress to a level where I’m comfortable calling myself intermediate. As part of this … Continue reading

Posted in Entropy, Singing | Leave a comment

Data are Nuisance Parameters

In inference, the term “nuisance parameters” refers to some quantity you feel like you need to put in your model, but that you don’t actually care about. For example, you might be fitting some data with a straight line $y = mx + b$, … Continue reading

Posted in Entropy, Inference, Information | 3 Comments
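The straight-line example can be sketched numerically: treat the intercept as the nuisance parameter and sum it out of a grid posterior, leaving a marginal posterior for the slope alone. All names and numbers here are invented for illustration, not taken from the post:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data from y = m*x + b + noise. We care about the slope m;
# the intercept b is the nuisance parameter.
true_m, true_b, sigma = 2.0, -1.0, 0.5
x = np.linspace(0.0, 1.0, 20)
y = true_m * x + true_b + sigma * rng.standard_normal(x.size)

# Gaussian likelihood on a grid over (m, b), with flat priors on the grid.
m_grid = np.linspace(0.0, 4.0, 400)
b_grid = np.linspace(-3.0, 1.0, 400)
M, B = np.meshgrid(m_grid, b_grid, indexing="ij")
resid = y[None, None, :] - (M[..., None] * x + B[..., None])
log_like = -0.5 * (resid**2).sum(axis=-1) / sigma**2

# Work relative to the maximum for numerical stability, then marginalise
# over the nuisance parameter b by summing along its axis.
post = np.exp(log_like - log_like.max())
post_m = post.sum(axis=1)

# Normalise to a density over m and compute the posterior mean slope.
dm = m_grid[1] - m_grid[0]
post_m /= post_m.sum() * dm
m_mean = (m_grid * post_m).sum() * dm
print(m_mean)
```

Summing over the `b` axis is the discrete analogue of integrating the nuisance parameter out of the joint posterior, so the resulting `post_m` already accounts for the uncertainty in the intercept.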