The facts of the question are that a fast food chain claims its regular hamburgers have an average of 310 calories. One consumer believes this average is actually much higher and takes a random sample of 45 hamburgers. The sample mean is x̄ = 314 and the sample standard deviation is s = 27. If I am asked to compute the standard error of the sample mean x̄, I should use the formula s / √n. Is that the correct formula, and would I then be computing 27 / √45?
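
For reference, here is the arithmetic worked out under that formula, using only the numbers given in the question (s = 27, n = 45):

SE(x̄) = s / √n = 27 / √45 ≈ 27 / 6.708 ≈ 4.02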

"Get 15% discount on your first 3 orders with us"
Use the following coupon
FIRST15

Order Now