GRE0177 #16

Posted: Fri Oct 14, 2005 1:35 am
by dtheili1
Here's an error analysis question that has me stumped...

A student makes ten 1-second measurements of the decay of a radioactive sample and gets the following values:
3, 0, 2, 1, 2, 4, 0, 1, 2, 5

How long should the student count to establish the rate to an uncertainty of 1%?

A) 80 s
B) 160 s
C) 2000 s
D) 5000 s
E) 6400 s

The answer is D. I've pored over my error analysis books, but I can't get it to work. I'm sure I'm missing something simple.

Thanks for the help.

Dale

Posted: Fri Oct 14, 2005 4:28 am
by danty
This problem is quite simple to solve if you have some experience with decay experiments.

Since the sample has a long half-life, the decay rate (counts per second) is effectively constant over the experiment.
The mean number of counts in a 1-second measurement is <C> = (3+0+2+1+2+4+0+1+2+5)/10 = 2. Radioactive decay follows Poisson statistics, so the error (standard deviation) on a total of C counts is DC = C^(1/2) = 2^(1/2). The fractional uncertainty is therefore DC/C = 1/C^(1/2). For a 1% uncertainty you need 1/C^(1/2) = 0.01, i.e. C = 10000 counts. At a rate of 2 counts/sec, that takes t = 10000/2 = 5000 sec.
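If anyone wants to check this numerically, here is a quick Python sketch (my own illustration, not from the problem) that reproduces the arithmetic and then simulates many 5000 s counting runs with Poisson draws to confirm the spread of the inferred rate comes out near 1%:

import numpy as np

data = [3, 0, 2, 1, 2, 4, 0, 1, 2, 5]

# Mean rate from the ten 1-second measurements: 2 counts/sec.
rate = np.mean(data)

# Poisson counting statistics: fractional uncertainty on C counts is
# sqrt(C)/C = 1/sqrt(C), so 1% uncertainty needs C = (1/0.01)^2 = 10000.
needed_counts = (1.0 / 0.01) ** 2
t = needed_counts / rate
print(f"rate = {rate:.1f} counts/s -> count for t = {t:.0f} s")  # 5000 s

# Sanity check: simulate 10000 counting runs of length t and look at
# the spread of the inferred rate; it should be about 1% of the mean.
sims = np.random.poisson(lam=rate * t, size=10000) / t
print(f"simulated fractional spread = {sims.std() / sims.mean():.4f}")  # ~0.01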

Posted: Fri Oct 14, 2005 5:44 pm
by dtheili1
OK, that makes sense. Thanks for the help.

Dale