# Homework Help: SNR in X-ray imaging

1. Feb 20, 2016

### BobP

1. The problem statement, all variables and given/known data
(1) An ideal digital detector suffers only from quantum noise. If, after being exposed to 5 µGy, the mean pixel value in the image is 100 and the standard deviation of the pixel values is 5, calculate the SNR.

The relationship between pixel value and detector dose is linear.

(2) What is the effect on SNR of applying a linear gain of factor 4 to increase all pixel values?

2. Relevant equations

3. The attempt at a solution

As I understand it, SNR = N/√N, so I would have said SNR = 10.
But I don't understand why they gave the S.D. value of 5...

Re part (2), I thought SNR was unaffected by gain, as both signal and noise would be multiplied by the same factor.

Thanks for the help

2. Feb 25, 2016

### Greg Bernhardt

Thanks for the post! This is an automated courtesy bump. Sorry you aren't generating responses at the moment. Do you have any further information, come to any new conclusions or is it possible to reword the post?

3. Feb 25, 2016

### haruspex

SNR is defined in different ways for different purposes. I can't think of a purpose for which it would take the form N/√N. (Why doesn't that collapse to √N? Are the two Ns different?)
The form μ/σ given at https://en.m.wikipedia.org/wiki/Signal-to-noise_ratio#Alternative_definition looks appropriate.
Of course, for a given arrangement, if you just vary the sample size N then you will get something proportional to √N, but not simply equal to it.
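The μ/σ definition can be checked numerically. Below is a minimal sketch, assuming the problem's stated values (μ = 100, σ = 5) and using normally distributed pixel values as a stand-in for quantum noise; it also shows why a linear gain leaves μ/σ unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)

# Values as stated in the problem: mean pixel value 100, standard
# deviation 5. With the mu/sigma definition, SNR = 100 / 5 = 20.
mu, sigma = 100.0, 5.0
snr = mu / sigma  # 20.0

# Simulate an image's pixel values (Gaussian stand-in for quantum noise).
pixels = rng.normal(mu, sigma, size=1_000_000)
snr_before = pixels.mean() / pixels.std()

# A linear gain g scales both the mean and the standard deviation by g,
# so mu/sigma -- and hence the SNR -- is unchanged.
g = 4
scaled = g * pixels
snr_after = scaled.mean() / scaled.std()
```

Here `snr_before` and `snr_after` agree to within floating-point error, consistent with the intuition in the original post that a pure gain does not change the SNR.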