What is the maximum entropy distribution for a positive continuous variable, given its first and second moments?
For example, a Gaussian distribution is the maximum entropy distribution for an unbounded variable, given its mean and standard deviation, and a Gamma distribution is the maximum entropy distribution for a positive variable, given its mean value and the mean value of its logarithm.
One may simply use Boltzmann's theorem, which is stated in the very Wikipedia article you point to.
Note that specifying the mean and variance is equivalent to specifying the first two raw moments, since each pair determines the other. (It isn't strictly necessary to invoke this equivalence, because the theorem can be applied directly to the mean and variance; working with the raw moments is just a little simpler.)
The theorem then establishes that the density must be of the form:
$$f(x) = c\exp(\lambda_1 x + \lambda_2 x^2) \quad \text{for all } x \ge 0.$$
Integrability over the positive real line restricts $\lambda_2$ to be $\le 0$ (with $\lambda_1 < 0$ required in the boundary case $\lambda_2 = 0$), and I think it places some further restrictions on the relationship between the $\lambda$s (which will presumably be satisfied automatically when we start from a specified mean and variance rather than from the raw moments).
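To make this concrete, here is a sketch (my own illustration, not part of the theorem) of numerically solving for the multipliers $\lambda_1, \lambda_2$ that match a given mean and variance; the target values below are arbitrary, and the bound keeps $\lambda_2$ strictly negative so the density stays integrable:

```python
import numpy as np
from scipy import integrate, optimize

# Arbitrary illustrative targets for a positive variable.
target_mean, target_var = 1.0, 0.5

def moments(lams):
    """First two raw moments of f(x) = c*exp(l1*x + l2*x^2) on [0, inf)."""
    l1, l2 = lams
    pdf = lambda x: np.exp(l1 * x + l2 * x**2)
    z, _ = integrate.quad(pdf, 0, np.inf)
    m1, _ = integrate.quad(lambda x: x * pdf(x), 0, np.inf)
    m2, _ = integrate.quad(lambda x: x**2 * pdf(x), 0, np.inf)
    return m1 / z, m2 / z

def residual(lams):
    m1, m2 = moments(lams)
    return [m1 - target_mean, (m2 - m1**2) - target_var]

# Keep lambda_2 strictly negative so the density is integrable.
sol = optimize.least_squares(residual, x0=[0.0, -1.0],
                             bounds=([-np.inf, -np.inf], [np.inf, -1e-6]))
l1, l2 = sol.x
m1, m2 = moments(sol.x)
print(l1, l2, m1, m2 - m1**2)  # fitted multipliers and matched moments
```

Any mean/variance pair attainable by a distribution of this form should be recoverable this way, with $\lambda_2 < 0$ coming out of the fit automatically.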
To my surprise (since I wouldn’t have expected it when I started this answer), this appears to leave us with a truncated normal distribution.
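One can verify the identification by completing the square: $\lambda_1 x + \lambda_2 x^2 = -(x-\mu)^2/(2\sigma^2) + \text{const}$ with $\sigma^2 = -1/(2\lambda_2)$ and $\mu = \lambda_1\sigma^2$. A quick numerical check (with illustrative multiplier values of my choosing) compares the normalized density against SciPy's truncated normal:

```python
import numpy as np
from scipy import integrate
from scipy.stats import truncnorm

# Illustrative multipliers with lambda_2 < 0 (not derived from any
# particular mean/variance pair).
l1, l2 = 1.0, -0.5

# Completing the square: l1*x + l2*x^2 = -(x - mu)^2/(2*sigma^2) + const.
sigma = np.sqrt(-1.0 / (2.0 * l2))
mu = l1 * sigma**2

# Normalize f(x) = c*exp(l1*x + l2*x^2) over [0, inf) directly.
z, _ = integrate.quad(lambda x: np.exp(l1 * x + l2 * x**2), 0, np.inf)
f = lambda x: np.exp(l1 * x + l2 * x**2) / z

# Normal(mu, sigma) truncated to [0, inf): standardized bounds a, b.
tn = truncnorm(a=(0 - mu) / sigma, b=np.inf, loc=mu, scale=sigma)

xs = np.linspace(0.01, 5, 50)
print(np.max(np.abs(f(xs) - tn.pdf(xs))))  # densities agree pointwise
```

So the maximum entropy density given the first two moments on $[0,\infty)$ is indeed a normal density restricted and renormalized to the positive half-line.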
As it happens, I don’t think I’ve used this theorem before, so criticisms or helpful suggestions on anything I haven’t considered or have left out would be welcome.