Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1117/1.AP.1.1.016004
- Scopus: eid_2-s2.0-85121459910
- Web of Science: WOS:000675241100010
Article: End-to-end deep learning framework for digital holographic reconstruction
Field | Value |
---|---|
Title | End-to-end deep learning framework for digital holographic reconstruction |
Authors | Ren, Z; Xu, Z; Lam, EYM |
Keywords | Holograms; 3D image reconstruction; Reconstruction algorithms; Holography; Digital holography |
Issue Date | 2019 |
Publisher | SPIE - International Society for Optical Engineering. The Journal's web site is located at https://www.spiedigitallibrary.org/journals/advanced-photonics |
Citation | Advanced Photonics, 2019, v. 1 n. 1, p. article no. 016004 |
Abstract | Digital holography records the entire wavefront of an object, including amplitude and phase. To reconstruct the object numerically, we can backpropagate the hologram with Fresnel–Kirchhoff integral-based algorithms such as the angular spectrum method and the convolution method. Although effective, these techniques require prior knowledge, such as the object distance, the incident angle between the two beams, and the source wavelength. Undesirable zero-order and twin images have to be removed by an additional filtering operation, which is usually manual and consumes more time in off-axis configuration. In addition, for phase imaging, the phase aberration has to be compensated, and subsequently an unwrapping step is needed to recover the true object thickness. The former either requires additional hardware or strong assumptions, whereas the phase unwrapping algorithms are often sensitive to noise and distortion. Furthermore, for a multisectional object, an all-in-focus image and depth map are desired for many applications, but current approaches tend to be computationally demanding. We propose an end-to-end deep learning framework, called a holographic reconstruction network, to tackle these holographic reconstruction problems. Through this data-driven approach, we show that it is possible to reconstruct a noise-free image that does not require any prior knowledge and can handle phase imaging as well as depth map generation. |
Persistent Identifier | http://hdl.handle.net/10722/278148 |
ISSN | 2577-5421 (2023 Impact Factor: 20.6; 2023 SCImago Journal Rankings: 5.361) |
ISI Accession Number ID | WOS:000675241100010 |
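The abstract refers to backpropagating a hologram with Fresnel–Kirchhoff integral-based algorithms such as the angular spectrum method. The snippet below is a minimal NumPy sketch of that classical method, for context only; it is not the paper's proposed holographic reconstruction network, and the wavelength, pixel pitch, and propagation distance are illustrative values, not taken from the article.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field over distance z (metres) with
    the angular spectrum method: FFT, multiply by the free-space
    transfer function, inverse FFT. Evanescent waves are suppressed."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)              # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # kz^2 = k^2 - kx^2 - ky^2; arg <= 0 marks evanescent components
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * z * kz) * (arg > 0)        # transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Propagating forward then backward by the same distance should
# recover the original field (a basic sanity check of the method).
np.random.seed(0)
hologram = np.exp(1j * 2 * np.pi * np.random.rand(64, 64))
fwd = angular_spectrum_propagate(hologram, 633e-9, 5e-6, 1e-3)
back = angular_spectrum_propagate(fwd, 633e-9, 5e-6, -1e-3)
print(np.allclose(back, hologram, atol=1e-8))  # → True
```

Note that, as the abstract points out, this conventional route needs the object distance `z` and wavelength as prior knowledge, and it does not remove the zero-order and twin images; the paper's end-to-end network is proposed precisely to avoid those requirements.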
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ren, Z | -
dc.contributor.author | Xu, Z | - |
dc.contributor.author | Lam, EYM | - |
dc.date.accessioned | 2019-10-04T08:08:25Z | - |
dc.date.available | 2019-10-04T08:08:25Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | Advanced Photonics, 2019, v. 1 n. 1, p. article no. 016004 | - |
dc.identifier.issn | 2577-5421 | - |
dc.identifier.uri | http://hdl.handle.net/10722/278148 | - |
dc.description.abstract | Digital holography records the entire wavefront of an object, including amplitude and phase. To reconstruct the object numerically, we can backpropagate the hologram with Fresnel–Kirchhoff integral-based algorithms such as the angular spectrum method and the convolution method. Although effective, these techniques require prior knowledge, such as the object distance, the incident angle between the two beams, and the source wavelength. Undesirable zero-order and twin images have to be removed by an additional filtering operation, which is usually manual and consumes more time in off-axis configuration. In addition, for phase imaging, the phase aberration has to be compensated, and subsequently an unwrapping step is needed to recover the true object thickness. The former either requires additional hardware or strong assumptions, whereas the phase unwrapping algorithms are often sensitive to noise and distortion. Furthermore, for a multisectional object, an all-in-focus image and depth map are desired for many applications, but current approaches tend to be computationally demanding. We propose an end-to-end deep learning framework, called a holographic reconstruction network, to tackle these holographic reconstruction problems. Through this data-driven approach, we show that it is possible to reconstruct a noise-free image that does not require any prior knowledge and can handle phase imaging as well as depth map generation. | - |
dc.language | eng | - |
dc.publisher | SPIE - International Society for Optical Engineering. The Journal's web site is located at https://www.spiedigitallibrary.org/journals/advanced-photonics | - |
dc.relation.ispartof | Advanced Photonics | - |
dc.rights | Advanced Photonics. Copyright © SPIE - International Society for Optical Engineering. | - |
dc.rights | Copyright 2019 Society of Photo-Optical Instrumentation Engineers (SPIE). One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this publication for a fee or for commercial purposes, and modification of the contents of the publication are prohibited. This article is available online at http://dx.doi.org/10.1117/1.AP.1.1.016004 | -
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject | Holograms | - |
dc.subject | 3D image reconstruction | - |
dc.subject | Reconstruction algorithms | - |
dc.subject | Holography | - |
dc.subject | Digital holography | - |
dc.title | End-to-end deep learning framework for digital holographic reconstruction | - |
dc.type | Article | - |
dc.identifier.email | Lam, EYM: elam@eee.hku.hk | - |
dc.identifier.authority | Lam, EYM=rp00131 | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.doi | 10.1117/1.AP.1.1.016004 | - |
dc.identifier.scopus | eid_2-s2.0-85121459910 | - |
dc.identifier.hkuros | 306262 | - |
dc.identifier.volume | 1 | - |
dc.identifier.issue | 1 | - |
dc.identifier.spage | article no. 016004 | - |
dc.identifier.epage | article no. 016004 | - |
dc.identifier.isi | WOS:000675241100010 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 2577-5421 | - |