GRE0177 #16

dtheili1
Posts: 5
Joined: Mon Oct 10, 2005 7:29 pm

GRE0177 #16

Post by dtheili1 » Fri Oct 14, 2005 1:35 am

Here's an error analysis question that has me stumped...

A student makes ten 1-second measurements of the decay of a radioactive sample and gets the following counts:
3, 0, 2, 1, 2, 4, 0, 1, 2, 5

How long should the student count to establish the rate to an uncertainty of 1%?

A) 80s
B) 160s
C) 2000s
D) 5000s
E) 6400s

The answer is D. I've pored over my error analysis books, but can't get it to work. I'm sure I'm missing something simple.

Thanks for the help.

Dale

danty
Posts: 19
Joined: Fri Sep 30, 2005 6:40 pm

Post by danty » Fri Oct 14, 2005 4:28 am

This problem is quite simple to solve if you have some experience with decay experiments.

Since the sample has a long half-life, the decay rate (counts per second) is effectively constant over the experiment.

The mean count in a 1-second measurement is <C> = (3+0+2+1+2+4+0+1+2+5)/10 = 2. Radioactive decay follows Poisson statistics, so the standard deviation of a total count N is DN = N^(1/2), and the fractional uncertainty is DN/N = 1/N^(1/2). For 1% uncertainty you need N^(1/2) = 100, i.e. N = 10000 total counts. At a rate of 2 counts/sec, the student must count for t = 10000/2 = 5000 sec, which is answer D.
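If anyone wants to sanity-check this numerically, here is a quick Monte Carlo sketch in Python (my own illustration, not part of the exam problem; the true rate of 2 counts/sec is taken from the data above, and the answer choices are used as the counting times):

# Monte Carlo check of Poisson counting statistics:
# the relative uncertainty of a measured rate should be 1/sqrt(rate*t).
import numpy as np

rng = np.random.default_rng(0)
true_rate = 2.0  # counts per second, from the mean of the sample data

for t in [80, 160, 2000, 5000, 6400]:
    # Simulate many counting runs of length t; total counts are Poisson.
    counts = rng.poisson(true_rate * t, size=100_000)
    rates = counts / t
    rel_err = rates.std() / rates.mean()
    print(f"t = {t:5d} s  ->  relative uncertainty ~ {rel_err:.3%}")

For t = 5000 s this should print roughly 1.0%, matching 1/sqrt(2 * 5000) = 1/100.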

dtheili1
Posts: 5
Joined: Mon Oct 10, 2005 7:29 pm

Post by dtheili1 » Fri Oct 14, 2005 5:44 pm

OK, that makes sense. Thanks for the help.

Dale
