The scores on a standardized test have an average of 1200 with a standard deviation of 60. A sample of 50 scores is selected. What is the probability that the sample mean will be greater than 1205? Round your answer to three decimal places.

The scores have an average of μ = 1200 with a standard deviation of σ = 60, and a sample of n = 50 scores is selected.

By the central limit theorem, the sample mean of 50 scores is approximately normally distributed with mean 1200 and standard error σ/√n.

Divide the population standard deviation by the square root of the sample size:

60 / √50 ≈ 8.485

Find the z-score of a sample mean of 1205:

z = (1205 – 1200) / 8.485 ≈ 0.589

From the standard normal table, P(Z ≤ 0.589) ≈ 0.722, so

P(x̄ > 1205) = 1 – 0.722 ≈ 0.278

Hence, the probability that the sample mean is greater than 1205 is 0.278.
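The probability asked for in the question can be checked numerically. A minimal sketch using only Python's standard library, assuming the values stated in the question (μ = 1200, σ = 60, n = 50, threshold 1205):

```python
from math import sqrt
from statistics import NormalDist

mu, sigma, n = 1200, 60, 50       # population mean, SD, and sample size from the question
se = sigma / sqrt(n)              # standard error of the sample mean (CLT)
z = (1205 - mu) / se              # z-score of a sample mean of 1205
p = 1 - NormalDist().cdf(z)       # upper-tail probability P(x̄ > 1205)
print(round(se, 3), round(z, 3), round(p, 3))  # → 8.485 0.589 0.278
```

`NormalDist().cdf` gives the standard normal CDF, so `1 - cdf(z)` is the upper-tail area read off a z-table in the steps above.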

Learn more about Statistics here: https://brainly.com/question/27165606