Capturing holographic images in extreme light-starved environments
Holograms are more than just a method of communication in science fiction films. Digital holography records interference patterns on a sensor and uses numerical processing to reconstruct an accurate, complex-valued, 3D image. This contrasts with traditional digital imaging, which relies on bulky lenses and captures only a flat, 2D picture.
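A common numerical reconstruction step in digital holography is back-propagating the recorded interference pattern to the object plane with the angular spectrum method. The sketch below is purely illustrative and is not the authors' code; the grid size, pixel pitch, wavelength, and propagation distance are made-up values chosen for a toy example.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field a distance z using the
    angular spectrum method (a standard digital-holography step)."""
    n = field.shape[0]
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(n, d=dx)              # spatial frequencies (cycles/m)
    FX, FY = np.meshgrid(fx, fx)
    # Free-space transfer function; evanescent components clipped to zero phase
    arg = 1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * k * z * np.sqrt(np.maximum(arg, 0)))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Toy inline hologram: a plane wave passing an opaque point object.
# The sensor records intensity only; back-propagating the square root
# of that intensity numerically refocuses the object.
n, dx, wavelength, z = 256, 2e-6, 532e-9, 1e-3   # assumed parameters
obj = np.ones((n, n), dtype=complex)
obj[n // 2, n // 2] = 0.0                        # opaque point object
at_sensor = angular_spectrum_propagate(obj, wavelength, dx, z)
hologram = np.abs(at_sensor) ** 2                # recorded fringe pattern
recon = angular_spectrum_propagate(np.sqrt(hologram), wavelength, dx, -z)
```

Because a single sensor plane records intensity, not phase, a simple back-propagation like this also produces a defocused "twin image" artifact; removing it is one of the jobs of learned reconstruction networks such as the PSHoloNet described below.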
Though useful in fields ranging from environmental research to biological imaging and particle field reconstruction, holography requires sufficient illumination and exposure time. This poses a challenge for high-speed particle tracking or capturing light-sensitive biological samples vulnerable to phototoxicity.
To perform holographic imaging under these conditions, Zhang et al. developed a method for photon-starved snapshot holography requiring less than one photon per pixel.
“We made improvements to both the hardware and algorithm aspects,” said author Yunping Zhang. “Our method of photon-starved holography uses a new system setup with a quanta image sensor (QIS) and a processing algorithm called PSHoloNet. By incorporating the QIS into the inline holographic setup for the first time, we can work with holograms created by limited photons with Poisson statistics, which opens up new possibilities in holographic imaging.”
With a more sensitive sensor and an algorithm tailored to photon-starved holography, the authors obtained results at far lower photon counts than previous holographic and conventional digital imaging methods.
“Our method reported a sensitivity of less than one photon per pixel. By comparison, a high-quality digital camera, based on a multi-megapixel array, typically requires thousands of photons per pixel to record an image,” said Zhang.
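To get a feel for what "less than one photon per pixel" means, the sketch below simulates Poisson-distributed photon counts on a made-up fringe pattern. Everything here is a hypothetical illustration of the statistics, not the paper's data or method: the pattern, grid size, and 0.5 photon-per-pixel level are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalized fringe pattern standing in for a hologram
# (a real pattern would come from the optical setup).
n = 512
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
intensity = 0.5 * (1 + np.cos(40 * np.pi * (X**2 + Y**2)))  # Fresnel-like rings

# Scale so the MEAN count is 0.5 photons per pixel, then draw
# Poisson-distributed counts, as a photon-counting sensor would.
mean_ppp = 0.5
counts = rng.poisson(intensity * (mean_ppp / intensity.mean()))

print(f"mean photons/pixel: {counts.mean():.2f}")
print(f"pixels recording zero photons: {(counts == 0).mean():.0%}")
```

At this light level the majority of pixels record no photons at all, which is why reconstruction algorithms for this regime must explicitly model Poisson statistics rather than assume a well-exposed image.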
The authors will continue improving their processing algorithms for even higher reconstruction performance, bringing us closer to digital holography under extreme conditions.
Source: “Photon-starved snapshot holography,” by Yunping Zhang, Stanley H. Chan, and Edmund Y. Lam, APL Photonics (2023). The article can be accessed at https://doi.org/10.1063/5.0145833.