How many commuters must be randomly selected to estimate the mean driving time of Chicago commuters? We want 95% confidence that the sample mean is within 3 minutes of the population mean, and the population standard deviation is known to be 12 minutes.

Mathematics · College · Thu Feb 04 2021


To estimate the mean driving time of Chicago commuters with a specified confidence level and margin of error, we use the sample-size formula for estimating a population mean when the population standard deviation is known. It comes from solving the margin-of-error expression of the confidence interval for n:

n = (Z * σ / E)^2

Where:
- n is the sample size
- Z is the Z-score corresponding to the desired confidence level
- σ (sigma) is the population standard deviation
- E is the margin of error (the maximum difference between the sample mean and the population mean that you're willing to accept)

For a 95% confidence level, we need to find the corresponding Z-score. The Z-score that corresponds to a 95% confidence level is approximately 1.96 (you can find this value in Z-tables or using statistical software).

Given that the population standard deviation (σ) is 12 minutes and the desired margin of error (E) is 3 minutes, we can now plug these values into the formula:

n = (1.96 * 12 / 3)^2
n = (1.96 * 4)^2
n = (7.84)^2
n ≈ 61.47

Since we can't have a fraction of a person, we always round up to the next whole number when it comes to sample size. Therefore, you would need to randomly select 62 commuters to estimate the mean driving time of Chicago commuters with 95% confidence that the sample mean is within 3 minutes of the population mean.
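As a quick check, the same calculation can be reproduced in a few lines of Python. This is just a sketch of the formula above; the function name `sample_size_for_mean` is my own, and the critical value is computed from the standard normal inverse CDF in the standard library rather than read from a Z-table.

```python
import math
from statistics import NormalDist

def sample_size_for_mean(confidence, sigma, margin):
    """Minimum n to estimate a population mean within `margin`
    at the given confidence level, with known population sigma."""
    # Two-tailed critical value: for 95% confidence this is
    # the 97.5th percentile of the standard normal, about 1.96.
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    # n = (Z * sigma / E)^2, rounded up to a whole number of people.
    return math.ceil((z * sigma / margin) ** 2)

print(sample_size_for_mean(0.95, 12, 3))  # 62
```

Using the exact critical value (≈1.95996) instead of the rounded 1.96 still gives n = 62 after rounding up, so the answer is not sensitive to that rounding.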