
Posted

The problem statement, all variables and given/known data
(1) An ideal digital detector only suffers from quantum noise. If, after being exposed to 5 µGy, the mean pixel value in the image is 100 and the standard deviation of the pixel values is 5, calculate the SNR.

The relationship between pixel value and detector dose is linear.

(2) What is the effect on SNR of applying a linear gain of factor 4 to increase all pixel values?


Attempt at a solution

As I understand it, SNR = 100/5 = 20
(but I am not certain; it could be 100/sqrt(10)). Clarification would be helpful.

Also, I think gain has no effect on SNR (as it increases signal and noise by the same factor), but I am not certain. A quick numerical check of both points is below.
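
A minimal sketch of that check, assuming the quantum noise is Poisson-distributed and rescaled so the mean pixel value is 100 with standard deviation 5 (the effective quantum count N and the variable names are my own assumptions, not given in the problem):

import numpy as np

rng = np.random.default_rng(0)

# Ideal detector: pure quantum (Poisson) noise.
# mean = 100, std = 5 implies an effective count of N = (mean/std)^2 = 400 quanta per pixel.
N = (100 / 5) ** 2
counts = rng.poisson(N, size=1_000_000)   # raw quantum counts
pixels = counts * (100 / N)               # rescale so the mean pixel value is 100

print(pixels.mean() / pixels.std())       # SNR ~ 20

# Linear gain of 4: mean and standard deviation both scale by 4, so their ratio is unchanged.
gained = 4 * pixels
print(gained.mean() / gained.std())       # still ~ 20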

Thanks for your help

Posted

(1) I would say so too, but I don't know the vocabulary used in X-ray imaging.

(2) Gain after a detector never improves the SNR. It can degrade it if the amplifier adds noise.

(2b) Sometimes, though, the amplifier also reduces the bandwidth to a more suitable value, and then it depends on how you define the noise: over the local bandwidth, the useful bandwidth, and so on. With a digital detector, the simple answer is the one expected.
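
To put a number on the amplifier-noise point, here is a hypothetical additive-noise model (the gain value, the amplifier noise level and the Gaussian approximation of the quantum noise are assumptions for illustration, not part of the original question):

import numpy as np

rng = np.random.default_rng(0)

# Quantum noise approximated as Gaussian: mean 100, std 5, so SNR = 20 at the detector.
pixels = 100.0 + rng.normal(0.0, 5.0, size=1_000_000)

gain = 4.0
amp_noise_std = 10.0   # hypothetical noise added by the amplifier, in output pixel units

ideal = gain * pixels                                                      # noiseless gain stage
noisy = gain * pixels + rng.normal(0.0, amp_noise_std, size=pixels.size)   # noisy gain stage

print(pixels.mean() / pixels.std())   # ~ 20
print(ideal.mean() / ideal.std())     # ~ 20: gain alone leaves SNR unchanged
print(noisy.mean() / noisy.std())     # ~ 17.9: 400 / sqrt(20**2 + 10**2), the noise terms add in quadrature

The noiseless case confirms the answer to (2); the noisy case shows why a gain stage after the detector can only preserve or degrade the SNR, never improve it.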
