The manager of a drive-thru restaurant must estimate the mean time (in minutes) it takes the cashiers to handle customer orders. Historically, the time required has followed a normal distribution. A random sample of 10 cashiers was taken, and from that sample a 99.8% confidence interval for the mean was obtained as (0.8905, 3.1095).

(1) What was x-bar for the sample of 10 cashiers?

(2) What is the error margin (e)?

This is a different type of population-mean estimation problem than I'm used to, and I'm not quite sure how to solve it. I tried what I thought might work but ended up nowhere near the given solution for either part. Could someone please show me the steps?

The answers are:

(1) 2

(2) 1.1095
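Both answers follow from the fact that a confidence interval for the mean is centered at the sample mean: x-bar is the midpoint of the interval, and the margin of error e is the half-width. A minimal sketch of the arithmetic (the interval endpoints are taken from the problem statement):

```python
# Confidence interval endpoints from the problem: (0.8905, 3.1095)
lower, upper = 0.8905, 3.1095

# The CI is x-bar +/- e, so:
x_bar = (lower + upper) / 2   # midpoint of the interval = sample mean
e = (upper - lower) / 2       # half-width of the interval = margin of error

print(round(x_bar, 4))  # 2.0
print(round(e, 4))      # 1.1095
```

Equivalently, once x-bar is known, e is just the distance from the center to either endpoint: 3.1095 - 2 = 1.1095.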
