Why are mean 0 and standard deviation 1 distributions always used?

My stats knowledge is self-taught, but a lot of the material I read points to a dataset having a mean of 0 and a standard deviation of 1.

If that is the case then:

  1. Why are a mean of 0 and an SD of 1 nice properties to have?

  2. Why does a random variable drawn from this sample equal 0.5? The chance of drawing 0.001 is the same as the chance of drawing 0.5, so this should be a flat distribution…

  3. When people talk about Z-scores, what do they actually mean here?

Answer

  1. At the beginning, the most useful answer is probably that a mean of 0 and an SD of 1 are mathematically convenient. If you can work out the probabilities for a distribution with a mean of 0 and a standard deviation of 1, you can work them out for any similar distribution of scores with a very simple equation; the first sketch after this list shows the idea.

  2. I’m not following this question. A mean of 0 and a standard deviation of 1 usually apply to the standard normal distribution, often called the bell curve. The most likely value is the mean, and the likelihood falls off as you get farther away from it. If you have a truly flat distribution, then no value is more likely than another. Your question here is poorly formed. Were you perhaps looking at questions about coin flips? Look up the binomial distribution and the central limit theorem. The second sketch after this list contrasts the bell curve with a truly flat distribution.

  3. “Mean here”? Where? The simple answer for z-scores is that they are your scores rescaled as if the mean were 0 and the standard deviation were 1. Another way of thinking about it is that a z-score expresses an individual score as the number of standard deviations that score is from the mean. The equation is z = (score – mean) / standard deviation. The reasons you’d do that are quite varied, but one is that intro statistics courses give you tables of probabilities for different z-scores (see answer 1 and the first sketch below).
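A minimal sketch of points 1 and 3, assuming Python with NumPy and SciPy (the scores below are made up for illustration and are not from the original answer): it rescales a handful of raw scores to a mean of 0 and an SD of 1, then uses the standard normal CDF in place of a printed z-table.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical raw scores; any roughly normal data would do.
scores = np.array([55.0, 63.0, 71.0, 80.0, 92.0])

# z = (score - mean) / standard deviation
z = (scores - scores.mean()) / scores.std()
print("z-scores: ", np.round(z, 2))

# Once the scores are on the mean-0, SD-1 scale, one table (or one CDF)
# answers probability questions for any normal distribution.
print("P(Z <= z):", np.round(norm.cdf(z), 3))
```

This is why intro courses only ever print one table: after standardizing, every normal distribution looks like the same standard normal.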
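And a second sketch for point 2, again just my own illustration: drawing from a standard normal versus a flat (uniform) distribution shows that the bell curve is not flat, because values near the mean turn up far more often than values in the tails.

```python
import numpy as np

rng = np.random.default_rng(0)
normal_draws = rng.standard_normal(100_000)
uniform_draws = rng.uniform(-3, 3, 100_000)  # a truly "flat" distribution on [-3, 3]

# Fraction of draws near the centre versus out in a tail.
for name, x in [("normal", normal_draws), ("uniform", uniform_draws)]:
    near_zero = np.mean(np.abs(x) < 0.5)
    above_two = np.mean(x > 2.0)
    print(f"{name:8s} within 0.5 of 0: {near_zero:.3f}   above 2: {above_two:.3f}")
```

With the normal draws, roughly 38% land within 0.5 of the mean and only about 2% land above 2; with the flat distribution both fractions are the same (about 17%), which is the difference the question is missing.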

If you had looked up “z-score” first, even on Wikipedia, you would have gotten pretty good answers.
