Is it possible for a distribution to have known variance but unknown mean?

Many tutorials demonstrate problems where the objective is to construct a confidence interval for the mean of a distribution whose variance is known but whose mean is unknown.
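The interval these tutorials build is the classic $z$-interval, which plugs the known standard deviation $\sigma$ straight into the formula:

$$\bar{x} \pm z_{\alpha/2}\,\frac{\sigma}{\sqrt{n}}.$$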

I have trouble understanding how the mean would be unknown when the variance is known since the formula for the variance assumes knowledge of the mean.

If this is possible, could you provide a real-life example?

Answer

A practical example: suppose I have a thermometer and want to build a picture of how accurate it is. I test it at a wide variety of known temperatures and find empirically that, when the true temperature is $T$, the temperature the thermometer displays is approximately normally distributed with mean $T$ and standard deviation 1 kelvin (and that different readings at the same true temperature are independent). I then take a reading in a room whose true temperature is unknown. It would be reasonable to model the distribution of that reading as normal with known standard deviation (1 K) and unknown mean.
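For concreteness, here is a minimal sketch in Python of how the known-$\sigma$ $z$-interval would be computed for such readings. The specific numbers (the true room temperature, the number of readings) are made up for illustration; note that the true temperature is used only to simulate data and never appears in the interval calculation itself.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

sigma = 1.0                  # known standard deviation of the thermometer, in kelvins
true_temperature_K = 294.65  # unknown to the experimenter; used only to simulate readings
n = 5                        # number of independent readings taken in the room

# Simulate n readings: approximately N(true temperature, sigma^2)
readings = rng.normal(loc=true_temperature_K, scale=sigma, size=n)

# 95% confidence interval for the unknown mean, using the KNOWN sigma
# (a z-interval, not a t-interval, since sigma does not need to be estimated)
z = stats.norm.ppf(0.975)
xbar = readings.mean()
half_width = z * sigma / np.sqrt(n)
print(f"estimate: {xbar:.2f} K, "
      f"95% CI: ({xbar - half_width:.2f}, {xbar + half_width:.2f})")
```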

Similar examples could arise with other measuring tools.

Attribution
Source: Link, Question Author: Jayaram Iyer, Answer Author: fblundun