If the population standard deviation is known to be 20, and you have a sample size of 100, what is the standard deviation of the distribution of sample means?
Answer
Correct Answer: The standard deviation of the distribution of sample means, also known as the standard error, is calculated as the population standard deviation divided by the square root of the sample size: \(\sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}}\). Therefore, with a population standard deviation of 20 and a sample size of 100, the standard error would be \(20 / \sqrt{100} = 20 / 10 = 2\).
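As a quick sanity check, here is a minimal Python sketch of the same calculation (the function name `standard_error` is illustrative, not from the original answer):

```python
import math

def standard_error(sigma: float, n: int) -> float:
    """Standard deviation of the distribution of sample means:
    population standard deviation divided by sqrt(sample size)."""
    return sigma / math.sqrt(n)

# Population standard deviation 20, sample size 100.
print(standard_error(20, 100))  # 2.0
```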