# The Naked Scientists Forum

#### Boogie

• Full Member
• Posts: 63
« on: 22/08/2012 20:43:44 »
I hope I put this in the right place.

I'm no mathematician or nuclear engineer, so please bear with me. Am I calculating this right?

Let's say, for example, a radiation monitor has an average background of 1872 CPS. During a scan, the maximum count is, say, 2143 CPS. How much of a sigma increase is that?

Is this correct? :

Delta = MaxScan - AverageBkg = 2143-1872 = 271
Sigma = Delta/SQRT(AverageBkg) = 271/SQRT(1872) = 6.263 sigma increase

Searching the internet for a confirmation of this calculation gets over my head rather quickly. If the above calculation is wrong, could someone please tell me what the proper process is without getting all long haired about it?

Thanks much!

#### distimpson

• Sr. Member
• Posts: 118
##### Re: Calculating radiation sigma levels
« Reply #1 on: 23/08/2012 00:27:02 »
Looks pretty good. I'm guessing the lesson deals with the Poisson distribution as it applies to radiation counts, and the important point that the variance (sigma squared) of this distribution equals the mean. Hence the standard deviation sigma is the square root of the mean, as you have calculated.
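For anyone who wants to check the arithmetic, here is a minimal Python sketch of the calculation from the original post, using the Poisson assumption that sigma equals the square root of the mean (the variable names are mine):

```python
import math

avg_bkg = 1872.0   # average background count rate (CPS), from the question
max_scan = 2143.0  # peak count rate observed during the scan (CPS)

# For Poisson-distributed counts, variance = mean, so sigma = sqrt(mean).
delta = max_scan - avg_bkg          # excess counts above background: 271
sigma = math.sqrt(avg_bkg)          # one standard deviation of the background
significance = delta / sigma        # roughly 6.26 sigma

print(f"{significance:.3f} sigma increase")
```

One caveat worth keeping in mind: this treats the numbers as raw counts. Since CPS is a rate, the statistics strictly depend on how many counts were actually accumulated over the measurement interval, so longer counting times give a more reliable sigma.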

#### Boogie

• Full Member
• Posts: 63
##### Re: Calculating radiation sigma levels
« Reply #2 on: 23/08/2012 00:37:45 »
Awesome! Thank you very much for the confirmation and brief details. :)
