jsmith613 Posted February 21, 2016

The problem statement, all variables and given/known data:
(1) An ideal digital detector suffers only from quantum noise. If, after being exposed to 5 µGy, the mean pixel value in the image is 100 and the standard deviation of the pixel values is 5, calculate the SNR. The relationship between pixel value and detector dose is linear.
(2) What is the effect on the SNR of applying a linear gain of factor 4 to increase all pixel values?

Attempt at a solution:
As I understand it, SNR = 100/5 = 20 (but I am not certain; it could be 100/sqrt(10)), so clarification would be helpful. I also think gain has no effect on the SNR (as it increases signal and noise by the same factor), but again I am not certain.

Thanks for your help
Enthalpy Posted April 6, 2016

(1) I too would say so, but I don't know the vocabulary of X-ray imaging.
(2) Gain after a detector never improves the SNR. It can degrade it if the amplifier adds noise.
(2b) Well, sometimes the amplifier also reduces the bandwidth to a more suitable value, and then it depends on how you define the noise: within the local bandwidth, within the useful bandwidth, and so on. With a digital detector, the simple answer is expected.
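A minimal numerical sketch (not from either post) of the point about gain: it assumes SNR is defined as mean pixel value divided by the standard deviation, as in the question, and models the noisy image with an illustrative Gaussian noise model (mean 100, standard deviation 5) on a hypothetical 512 x 512 image. Multiplying every pixel by 4 scales mean and standard deviation equally, so the ratio is unchanged.

import numpy as np

# Illustrative assumptions: image size and Gaussian noise model are not from the posts.
rng = np.random.default_rng(0)
pixels = rng.normal(loc=100.0, scale=5.0, size=(512, 512))

snr = pixels.mean() / pixels.std()           # SNR as mean / standard deviation, ~20
gained = 4.0 * pixels                        # linear gain of factor 4
snr_gained = gained.mean() / gained.std()    # gain scales mean and std by the same factor

print(f"SNR before gain: {snr:.2f}")
print(f"SNR after gain:  {snr_gained:.2f}")  # essentially identical: gain alone does not change SNR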