I'm trying to figure out a way to calibrate the intensity response of a fiber optic spectrometer (Ocean Optics Red Tide, 350-1000 nm). Let's assume that the wavelength calibration is accurate. The problem is that each pixel of the detector array (each corresponding to a different wavelength) has a different response - call it A(lambda). So when a source with a spectral irradiance of B(lambda) is input to the spectrometer, the measured output is A(lambda)*B(lambda). Note that this is no problem for transmission measurements, since in that case a ratio of two measurements is taken and the A(lambda) factor divides out. I, however, would like to measure the spectral irradiance of some light sources, which is a single measurement, so I need to measure the A(lambda) factor and calibrate it out.
Ordinarily this is accomplished using a calibrated light source of known spectral irradiance. Pop it in, measure the output, divide by the known spectral irradiance - done. But I don't have one, they cost a bunch of money, and it's an interesting exercise to see if I can figure out how to do this without one.
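In code, that standard procedure is just elementwise division. Here's a minimal sketch of the bookkeeping (the array names are mine, one entry per detector pixel):

```python
# Minimal sketch of the standard calibration bookkeeping.
# Inputs are numpy arrays with one entry per detector pixel:
#   counts_ref : raw counts measured with the calibrated source attached
#   b_ref      : that source's known spectral irradiance on the same wavelength grid
#   counts_unk : raw counts from the source I actually want to characterize

def pixel_response(counts_ref, b_ref):
    return counts_ref / b_ref          # A(lambda) = measured / true

def corrected_irradiance(counts_unk, a_response):
    return counts_unk / a_response     # B(lambda) = measured / A(lambda)
```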
My current favorite approach is to use the solar spectrum as a reference. Lots of pretty accurate measurements of the solar spectrum exist and I have access to the ASTM reference spectrum, so in principle I just need to point the spectrometer at the sun and go for it. Two problems: First, way too much light - I would saturate and probably damage the detector if I just pointed it directly at the sun, and any sort of filter used to attenuate the light introduces its own spectral transmittance that messes up the results. In principle I guess I could measure the spectral transmittance of a stack of filters of sufficient optical density and then calibrate that out, but that seems like a lot of measurements, with errors accumulating at each step. Second, the sun and the sky have different spectra. Both are tabulated, but the two must be separated to get an accurate calibration.
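Assuming I can get an unsaturated solar spectrum into the instrument (see the pinhole idea below), the data reduction would look roughly like this. Just a sketch: the file name and array names are placeholders, and A(lambda) only comes out up to an overall scale factor since I won't know the absolute throughput.

```python
import numpy as np

# Placeholder file: two columns, wavelength (nm) and direct-normal spectral
# irradiance (W m^-2 nm^-1), taken from the ASTM reference tables.
ref_wl, ref_irr = np.loadtxt("astm_direct_normal.txt", unpack=True)

def response_from_sun(wl_pixels, counts_sun):
    # wl_pixels  : wavelength assigned to each spectrometer pixel
    # counts_sun : dark-subtracted counts from the imaged solar disk
    # Interpolate the tabulated reference onto the instrument's wavelength grid,
    # then divide; this gives A(lambda) only up to an overall scale factor.
    b_sun = np.interp(wl_pixels, ref_wl, ref_irr)
    return counts_sun / b_sun
```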
So here's my approach, addressing both of these issues. Use a pinhole camera to image the solar disk onto the spectrometer input port. A pinhole shouldn't introduce any spectral transmission artifacts into the solar spectrum. Furthermore, by imaging the solar disk I should isolate the solar spectrum from the blue-sky background. Oh yeah, using a 300 um pinhole (about the smallest I can reliably produce) and a 1 meter path length I should be able to attenuate the beam by about a factor of 1000. Hopefully that's enough. If not, I can buy smaller pinholes pretty cheaply and bring the attenuation up to 4-5 orders of magnitude, which definitely would be enough.
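For what it's worth, here's the back-of-envelope behind that factor-of-1000 estimate (geometric optics only, ignoring diffraction): the power that gets through the pinhole is spread over the solar image, which is roughly L times the sun's angular diameter across.

```python
theta_sun = 9.3e-3   # angular diameter of the sun, radians (~0.53 degrees)
d = 300e-6           # pinhole diameter, m
L = 1.0              # pinhole-to-image distance, m

# Power through the pinhole (E_direct * pi*d^2/4) spreads over the solar image,
# whose diameter is ~ L*theta_sun, so the irradiance ratio is (d / (L*theta_sun))^2.
attenuation = (d / (L * theta_sun)) ** 2
print(f"image irradiance / direct-sun irradiance ~ {attenuation:.1e}")  # ~1e-3
```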
Another, more pedestrian, but possibly more reliable approach is to use a silicon photodiode and a monochromator (which I also have lying around the home/lab). The spectral responsivity of silicon is well characterized, so if I tune the monochromator to each wavelength, measure the response of the photodiode, and divide by the responsivity, I should have a measure of the relative irradiance at every wavelength coming out of the monochromator. This is a lot more laborious than the solar method, but would probably work well enough. Any calibrated optical power meter would do the job here, but I don't have one (I know, he owns a spectrometer and a monochromator, but no power meter?). Si photodiodes, on the other hand, are cheap and ubiquitous.
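The reduction for that route would be similar - again just a sketch, with names of my own, the responsivity curve taken from a published silicon photodiode datasheet, and everything in relative units:

```python
import numpy as np

def response_from_monochromator(i_photo, r_si, counts):
    # i_photo : photodiode photocurrent at each monochromator setting (A)
    # r_si    : silicon responsivity at those wavelengths (A/W), published curve
    # counts  : spectrometer counts at the corresponding pixels, same settings
    power_rel = i_photo / r_si    # relative optical power out of the monochromator
    return counts / power_rel     # A(lambda), again only up to an overall scale

# Made-up numbers, just to show the shape of the calculation:
i_photo = np.array([0.8e-6, 1.2e-6, 1.0e-6])   # A
r_si    = np.array([0.20, 0.35, 0.55])         # A/W at three wavelengths
counts  = np.array([1500.0, 4200.0, 3600.0])
print(response_from_monochromator(i_photo, r_si, counts))
```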
Anyway, I'm curious as to opinions on these approaches. Any "gotchas" I should be wary of? Any alternate approaches I should consider?
Any information/opinions appreciated.
Thanks,
RBD (the old laser guy)