EdEarl Posted July 14, 2016

From phys.org: Researchers have developed an algorithm so efficient that it can generate high-quality 3D images using a single-photon camera that detects just one signal photon per pixel. The single-photon camera captures the light reflected by the illuminated objects, along with some background light that the researchers added to simulate a realistic environment. The key element of the camera is a single-photon avalanche diode (SPAD) array, which detects incoming photons and records their precise times of arrival.

I don't know how sensitive the cameras currently used in telescopes are, but if this one is better, then images can be captured faster, leaving more time for other observations, or fainter objects can be captured with longer exposures, making it possible to see things that cannot currently be seen. Since the arrival time of each photon is recorded, will that change the nature of optical interferometry? Will it make arrays of smaller telescopes practical, instead of building massive telescopes like the ELT?
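For a rough sense of how recorded arrival times become range information, here is a minimal time-of-flight sketch in Python. It is not the researchers' reconstruction algorithm, and the 20 ns arrival time is an illustrative assumption:

```python
# Minimal sketch: mapping a SPAD pixel's photon arrival time to range
# in a pulsed time-of-flight setup. Illustrative only, not the
# algorithm described in the phys.org article.

C = 299_792_458.0  # speed of light, m/s

def range_from_arrival(t_arrival_s, t_pulse_s=0.0):
    """Round-trip flight time of the detected photon -> distance to the surface."""
    return C * (t_arrival_s - t_pulse_s) / 2.0

# A photon detected 20 ns after the laser pulse leaves the camera (assumed value)
print(range_from_arrival(20e-9))  # ~3.0 m
```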
imatfaal Posted July 14, 2016

I would have thought that you would struggle with the idea of localization of a photon, i.e. you cannot sharply localize a photon.
swansont Posted July 14, 2016

"I would have thought that you would struggle with the idea of localization of a photon, i.e. you cannot sharply localize a photon"

Tough to localize while in transit, but not so much while interacting. Photons get absorbed by individual atoms. Pixels are relatively large: if you have a 1 cm x 1 cm chip with 100 megapixels, that's a square micron each.
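swansont's pixel-size figure checks out; a quick back-of-the-envelope in Python (the chip dimensions are his hypothetical example, not a specific real sensor):

```python
# 1 cm x 1 cm chip divided into 100 megapixels -> area per pixel
chip_side_um = 1e4                 # 1 cm = 10,000 micrometres
pixel_count = 100e6                # 100 megapixels
area_per_pixel_um2 = chip_side_um ** 2 / pixel_count
print(area_per_pixel_um2)          # 1.0 square micrometre per pixel
```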
EdEarl Posted July 14, 2016

CCDs can already detect single photons at some wavelengths with near-100% quantum efficiency: http://www.andor.com/learning-academy/quantum-efficiency-(qe)-in-high-energy-ccd-detectors-understand-qe-in-a-high-energy-ccd
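As a rough illustration of what near-100% quantum efficiency means for faint sources, here is a simple shot-noise estimate. The photon rate, exposure time, and QE value are assumed numbers for illustration, not figures from the Andor page:

```python
# Shot-noise-limited photon counting: detected counts follow Poisson statistics,
# so the signal-to-noise ratio is sqrt(mean detected counts).
import math

def mean_detections(photon_rate_per_s, exposure_s, quantum_efficiency):
    """Expected number of detected photons on one pixel."""
    return photon_rate_per_s * exposure_s * quantum_efficiency

mean = mean_detections(0.5, 60.0, 0.95)   # assumed: 0.5 photons/s, 60 s exposure, QE = 0.95
snr = math.sqrt(mean)                     # shot-noise-limited SNR
print(f"mean counts ~ {mean:.1f}, SNR ~ {snr:.1f}")
```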