I. INTRODUCTION
Optical diffraction tomography is the premier label-free method to produce maps of refractive index within optically thick volumetric samples, typically in transmission [1] but more recently also in epi-mode [2]. A spatially coherent source is typically employed, and computational tomographic techniques reconstruct a 3D map of phase and absorption (or a complex index of refraction) from a diverse set of incident angles while projecting through a large number of focal planes at once. Partially coherent methods take an alternate approach, illuminating with a large number of incoherent incident angles at once but capturing the transmissive effects of only one focal plane at a time, which is isolated by the reduced coherence volume produced by the broad angular distribution of the light source. This allows the production of quantitatively accurate 3D maps of the index of refraction using simple z-stage actuation as the only moving element, with reconstruction performed by a computationally inexpensive direct deconvolution [3].

By deconvolving an imaging system's optical transfer function (OTF) from a captured image, it is possible, in principle, to recover the phase and absorption of unlabeled specimens given only a knowledge of the imaging system's acceptance pupil and the source's spatial and temporal frequency distribution [4]. While an OTF for an axially symmetric, monochromatic source can be produced rapidly by reducing dimensions [5,6], doing so with numerical accuracy for asymmetrical cases can be computationally expensive, and for broadband sources infeasible. This limitation restricts practical quantitative analysis to a limited set of well-controlled illumination parameters, such as annular phase contrast [7,8] or a discrete collection of LEDs [9]. Therefore, available technologies and theoretical frameworks fail to take advantage of widely available, low-cost partially coherent light sources that emit over a broad range of wavelengths (e.g., lamps, LEDs, and thermal emitters) and that hold distinct advantages for microscopy over single-wavelength sources, such as improved spatial frequency support [10] and temporal coherence gating [11,12]. Sources that vary continuously or asymmetrically with angle and/or wavelength, such as light scattered through a diffusing medium [2], demand a more general treatment of the problem.

While partially coherent tomography typically achieves depth sectioning through the spatial coherence gating enabled by a broad angular spectrum, combining the effects of spatial and temporal coherence gating has motivated a number of recent tomographic technologies, including full-field optical coherence tomography [12], white-light optical diffraction tomography [13], and reflection phase microscopy [10]. The theoretical development in these works is tailored to their specific applications, relying on symmetries (such as axial symmetry) or sparsity (such as a discrete LED array) in the illumination pattern and spectrum to produce a succinct expression relating the images formed to the object under investigation. However, many label-free techniques, such as differential interference contrast [14], differential phase contrast [15], and quantitative oblique back-illumination [2,16], rely on continuous asymmetric illumination patterns to provide phase contrast.

In this work, we demonstrate a compact mutual coherence propagation kernel and employ it to produce an efficient computational method for generating an OTF for sources with arbitrary spectral and angular power spectra. By analytically finding, beforehand, the set of broadband source wave vectors that can contribute to a given spatial frequency in a 3D intensity microscope image, we compress the computation of optical transfer functions for practically any available source, streamlining computational image processing and expanding the breadth of quantitative experimental techniques available to optical investigators.
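As an illustrative numerical sketch of the direct-deconvolution step referenced above (not code from the original work; the function name, the synthetic OTF, and the Tikhonov constant `eps` are all assumptions for demonstration), the following shows how a 3D "phase object" blurred by a known OTF can be recovered by a regularized inverse filter:

```python
import numpy as np

def deconvolve_phase(intensity_stack, phase_otf, eps=1e-6):
    """Tikhonov-regularized direct deconvolution: divide the 3D image
    spectrum by the (conjugated) OTF, damping frequencies with weak support."""
    I = np.fft.fftn(intensity_stack)
    H = phase_otf  # OTF sampled on the same 3D Fourier grid as the image
    recovered = I * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.fft.ifftn(recovered).real

# Toy forward model: a smooth, strictly positive synthetic OTF blurs a
# random object; deconvolution then recovers it closely.
n = 16
k = np.fft.fftfreq(n)
k2 = k[:, None, None] ** 2 + k[None, :, None] ** 2 + k[None, None, :] ** 2
H = np.exp(-k2 / 0.5)                       # synthetic OTF (illustrative only)
phi = np.random.default_rng(0).standard_normal((n, n, n))
blurred = np.fft.ifftn(np.fft.fftn(phi) * H).real
rec = deconvolve_phase(blurred, H)
```

Because the synthetic OTF here is strictly positive, the inverse filter is well conditioned; for a physical OTF with zeros or dark-field gaps, the regularization constant governs how aggressively unsupported frequencies are suppressed.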
II. PARTIALLY-COHERENT BROADBAND 3D OPTICAL TRANSFER FUNCTION
To form a theoretical justification for this method, we start by examining the local state of coherence of a broadband field due to the incoherent contributions of emitting objects at a sufficient distance. As the second-order correlations of a stationary field themselves evolve according to coupled wave equations analogous to those governing the field itself [17], spatial and temporal coherence are fundamentally linked by laws of propagation [18] (cf. Sec. I of the supplementary material). It is through their mutual contributions that we are able to arrive at a concise formula for spatiotemporal coherence.

By constructing the mutual coherence within a spherical geometry, we can show that a surrounding volumetric source of arbitrary physical dimensions extends radially away from the near field, and the object-space representation conveniently transforms into the 3D k-space representation of the field. This assumes a wide-sense stationary, spatially incoherent volumetric source at a distance from the observation region that is large relative to both the maximum wavelength and the dimensions of the region of observation. In widefield microscopy, these conditions are met when illuminating with a source of any state of temporal coherence in the conjugate plane of the sample, the standard configuration of modern widefield microscopy [19], or, additionally, with a temporally incoherent source scattered through a diffusing volume that may be near the sample [16]. More general sources, such as temporally coherent light scattered through a static diffuser, may require further treatment, such as that provided in Sec. II of the supplementary material for certain Schell-model sources [20].

As shown in Sec. I of the supplementary material, a compact formulation of the source-integrated radiant spectral density of light impinging on the target object in a microscope can be found by propagating to the far field in a spherical geometry, projecting the source distribution onto the unit sphere, and mapping temporal frequency onto the radial dimension of the 3D spherical reciprocal space u⃗, giving rise to the following 3D broadband analog of the van Cittert–Zernike theorem:

W(\vec{x}, \nu) = \int S(\vec{u})\, e^{i 2\pi \vec{x}\cdot\vec{u}}\, \delta(|\vec{u}| - \nu)\, d^{3}\vec{u}. \qquad (1)

Here, x⃗ represents 3D coordinates at the object, and u⃗ represents a 3D Fourier plane-wave space in which the radial direction û maps to the direction of propagation.
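As a minimal numerical sketch of Eq. (1) (illustrative only, not part of the original derivation; the function name and the isotropic test source are assumptions), note that the delta function restricts the 3D integral to the sphere |u⃗| = ν, leaving a surface integral that can be estimated by Monte Carlo sampling of directions:

```python
import numpy as np

def W(S, x, nu, n_dirs=100_000, seed=0):
    """Evaluate the sifted integral of Eq. (1): the delta function confines
    the 3D integral to the sphere |u| = nu, giving the surface integral
    W(x, nu) = nu^2 * ∮ S(nu*uhat) exp(i 2π nu x·uhat) dΩ,
    estimated here by averaging over uniformly random unit directions."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal((n_dirs, 3))
    uhat = v / np.linalg.norm(v, axis=1, keepdims=True)   # uniform on sphere
    vals = S(nu * uhat) * np.exp(1j * 2 * np.pi * nu * (uhat @ x))
    return 4 * np.pi * nu**2 * vals.mean()

# Sanity check against the closed form for an isotropic unit source,
# for which Eq. (1) reduces to W(x, nu) = 4π nu^2 sinc(2 nu |x|):
nu, x = 2.0, np.array([0.3, 0.0, 0.0])
W_num = W(lambda u: 1.0, x, nu)
W_exact = 4 * np.pi * nu**2 * np.sinc(2 * nu * np.linalg.norm(x))
```

The isotropic case recovers the familiar spherical-wave (sinc) coherence kernel; an anisotropic `S` would simply reweight the directional average.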
Notably, the radiant cross-spectral density, W (a Fourier pair with the mutual coherence, Γ), depends on only three spatial dimensions (instead of six) due to a shift invariance in the local reference frame. In addition, owing to the stationary statistics of the problem, we scale time by the speed of light and therefore represent frequency in units of inverse distance as the linear wavenumber ν = k/(2π) (which we will refer to simply as wavenumber), onto which we map the radius |u⃗| in reciprocal space. The term S(u⃗) represents the spatiotemporal spectral intensity of the source, describing its radiant power as a function of direction and wavenumber in the 3D Fourier space u⃗; it is defined in detail in Sec. I of the supplementary material. We arrive at the result that the angular spectrum of the source spectral intensity S(u⃗) contains sufficient information to give rise to the cross-spectral density of the illuminating light at the target, with a delta function isolating the interaction of individual temporal frequencies ν. Equation (1) expresses the propagation of mutual coherence for a broadband source in a succinct form, is evaluated with a single volumetric integral that resembles a frequency-sifted 3D Fourier transform, and forms the conceptual basis for the treatment presented in the rest of this work. Here and in the supplementary material, we define and apply this relationship in the context of a general imaging system, allowing us to produce a general-purpose 3D OTF for systems with arbitrary spatiotemporal source spectra and to compute it efficiently.

By taking this approach, we propagate a cross-spectral density with wavelength selection imposed by the delta function in Eq. (1). Similarly, we can propagate arbitrary fields from the object out to the pupil plane and then again to the camera while maintaining a concise form for the intensity image formed at the camera of a microscopic imaging system. Further details are offered in Sec. III of the supplementary material. Using the same line of reasoning for propagation in a spherical geometry as in the treatment of the mutual coherence [see Fig. 1(a)], we produce a 3D broadband analog of the bilinear Hopkins equation for image intensity:

\tilde{I}_c(\vec{q}) = \iint S(\vec{u})\, O\!\left(\vec{p}+\tfrac{\vec{q}}{2}\right) O^{*}\!\left(\vec{p}-\tfrac{\vec{q}}{2}\right) P\!\left(\vec{u}+\vec{p}+\tfrac{\vec{q}}{2}\right) P^{*}\!\left(\vec{u}+\vec{p}-\tfrac{\vec{q}}{2}\right) \delta\!\left(|\vec{u}| - \left|\vec{u}+\vec{p}+\tfrac{\vec{q}}{2}\right|\right) \delta\!\left(|\vec{u}| - \left|\vec{u}+\vec{p}-\tfrac{\vec{q}}{2}\right|\right) d^{3}\vec{u}\, d^{3}\vec{p}. \qquad (2)

Here, the intensity at the camera, Ĩ_c, is evaluated in the 3D Fourier space q⃗ as a convolution of the object transmission spectrum O with a filtering quantity formed by the convolution of the source (S), the 3D pupil functions (P), and sampling delta functions over the source spatial frequency (u⃗). As shown in Sec. III of the supplementary material, the explicit dependence on ν vanishes, as it is naturally incorporated into the outward-propagating field coordinates p⃗. In addition, while biological samples are generally weakly absorbing and minimally dispersive in the visible and near-infrared regions of the spectrum [21], the theory can be adapted to accommodate variation due to dispersion or variable absorption in situations where it may apply. Here we consider only the former case and leave the latter for future work.

This is only an intermediate result, but it is important to note the difference between Eq. (2) and its quasi-monochromatic counterpart well established in the literature [22]. The latter was developed with the implication that the results may be extended to a broadband source by re-computing at each desired wavelength and integrating over the desired source spectrum [23]. While this approach is valid by superposition, it may be computationally difficult or infeasible for certain non-reducible experimental conditions. By incorporating a generalized broadband source from the inception, we find that the resulting intensity integral is more concise in its formalism and more comprehensible in its description, allowing the important insights that give rise to the compressed computational approach described later in this work.

While Eq. (2) is of interest to phase-space optics research [24], it is computationally dense; for the purposes of this work, it is an intermediate result leading to the 3D OTF given by Eq. (3) (see Sec. III of the supplementary material for the derivation). This result mirrors the formulation in the landmark work by Streibl [3], but once again with the implicit inclusion of an arbitrary source bandwidth:

\tilde{T}_{\alpha/\phi}(\vec{q}) = \int S(\vec{u}) \left[ P\!\left(\vec{u}-\tfrac{\vec{q}}{2}\right) \delta\!\left(|\vec{u}| - |\vec{u}-\vec{q}|\right) \pm P^{*}\!\left(\vec{u}+\tfrac{\vec{q}}{2}\right) \delta\!\left(|\vec{u}+\vec{q}| - |\vec{u}|\right) \right] d^{3}\vec{u}. \qquad (3)

Here, the subscripts α and ϕ refer to the absorption and phase signals, respectively. The arguments of the delta functions each describe a plane in u⃗ with normal vector ±q⃗/2. This non-paraxial 3D phase and amplitude transfer function is similar to those reported in the literature [2,8], with the key distinctions being the implicit extension to a broad spectrum and, consequently, the form of the arguments of the sifting delta functions. In addition, the assumption here is that the object is slowly varying rather than weakly reflecting. The former, in practice, provides a transfer function that is nearly equivalent to that produced by the weak-object assumption but additionally allows the consideration of dark-field illumination terms, where the source numerical aperture (NA) exceeds the pupil NA; it is based on the Rytov approximation, which is considered more valid in biological samples [5,25].

The Hermitian symmetry of intensity imaging demands two wave vectors with equal and opposite projections on the camera [see Fig. 1(a)], and the sampling surfaces defined by the delta functions in Eq. (3) represent the collection of pairs of source field vectors equidistant from each other in the direction of the scattering object vector q⃗. By analogy, the sampling surface of the spectral density itself in Eq. (1), when interacting with the object, corresponds to the individual field vectors equidistant from the scattering vector (i.e., of the same wavenumber), giving rise to the Ewald sphere. Due to the coordinate change in the argument of the delta function from Eq. (1), the valid sampling surface in Eq. (3) changes in form from a sphere to a plane, so we refer to it as the Ewald plane.
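The wavelength independence of the Ewald plane can be verified directly. The following sketch (illustrative, not from the original work) confirms that any point on the plane orthogonal to q⃗ passing through q⃗/2 satisfies the sifting condition |u⃗| = |u⃗ − q⃗| of Eq. (3), whatever its wavenumber |u⃗|:

```python
import numpy as np

rng = np.random.default_rng(1)
q = np.array([0.4, 0.1, 0.2])               # an object spatial frequency vector
qhat = q / np.linalg.norm(q)

# Points on the "Ewald plane": the plane orthogonal to q passing through q/2.
# Every such point satisfies |u| = |u - q|, regardless of its radius (wavenumber).
on_plane = []
for _ in range(5):
    t = rng.standard_normal(3)
    t -= (t @ qhat) * qhat                  # keep only the transverse component
    u = q / 2 + t                           # arbitrary |u|, i.e., arbitrary wavelength
    on_plane.append(np.isclose(np.linalg.norm(u), np.linalg.norm(u - q)))
```

Geometrically, |u⃗|² = q²/4 + |t⃗|² = |u⃗ − q⃗|² for any transverse offset t⃗, which is exactly why every wavenumber on the plane contributes to the same image spatial frequency.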
This plane defines the set of valid field interactions, each of which contributes incoherently to the total intensity of a given spatial frequency vector q⃗ in the 3D image space, regardless of wavelength. The Ewald planes select these fields for a given spatial frequency on the camera, and the resulting component of the linear 3D transfer function corresponds to the sum total of all incident source fields that contribute to that spatial frequency. The amplitude signal is given by the sum of the two sampling Ewald planes selected by the integration over the delta functions in Eq. (3), while the phase signal is given by their difference.

III. COMPRESSION OF COMPUTATION
To evaluate Eq. (3) quickly and efficiently, it is necessary first to determine the boundaries of the intersections of the Ewald planes with the source and pupil distributions so that a parsimonious sum may be taken over only the pre-determined non-zero regions. The angular acceptance of the source and pupil imposes a cone surface delineating their boundaries, whose apical angle is determined by their respective entrance NAs; both cones may be centered at the origin. In addition, minimum and maximum temporal frequencies can be defined for a truncated effective spectrum, whose boundaries are spheres centered at the origin. This geometry is explored further in the supplementary material, Sec. IV A.

The intersections of the Ewald planes with the source and pupil boundaries form hyperbolas (or, in some cases, ellipses), and their intersections with the frequency limits form circles. It is therefore possible, using the methods detailed in the supplementary material, Sec. IV B, to determine analytically the boundaries of integration in the frame of reference of the intersecting plane. In polar coordinates on the surface, the radial extent of the integration is determined by the frequency limits, and the angular extent by the pupil hyperbola on the left-hand side of the plane. Figure 2 shows an example of the boundaries projected onto an intersecting plane, with the shaded region indicating the region of integration.

The value of the function to be integrated is determined by the product of the source spatial spectrum, the temporal spectrum, and the appropriate Jacobian of the transformation (see the supplementary material, Sec. V).
The computation is most conveniently performed in polar coordinates on the 2D sampling plane. The radial limits of integration are determined by the temporal spectrum, given as

\rho_{\min/\max} = \frac{1}{2}\sqrt{R(\nu_{\min/\max})^{2} - q^{2}}. \qquad (4)

The hyperbolic limits can be found with algebraic geometry, and the interior region is then obtained as the logical intersection of the two hyperbolic regions. This analysis leads to a numerical formula for the surfaces of intersection in terms of the angle Φ shown in Fig. 2 (for details, see the supplementary material, Sec. IV B),

\cos\Phi_{o/i} = -\frac{q}{2 q_x \rho}\left[\sqrt{\rho^{2}+\frac{q^{2}}{4}}\left((1+\alpha_P^{2})^{-\frac{1}{2}}+(1+\alpha_S^{2})^{-\frac{1}{2}}\right) + \sqrt{\rho^{2}+\frac{q^{2}}{4}}\left((1+\alpha_P^{2})^{-\frac{1}{2}}-(1+\alpha_S^{2})^{-\frac{1}{2}}\right) \mp q_3\right], \qquad (5)

where the subscripts o and i distinguish the Ewald surface of integration, referring to the surfaces with outward-facing and inward-facing normal vectors, which determine whether the shift is in the positive or negative q⃗/2 direction. Furthermore, α_S and α_P represent the source and pupil NA, respectively, and for notational convenience q = |q⃗| and q_x = \sqrt{q_1^{2}+q_2^{2}}.

With the limits of integration determined by Eq. (5), the formula for computing the OTF [Eq. (3)] condenses to

T_{\alpha/\phi}(\vec{q}) = \int_{\Phi_o}^{2\pi-\Phi_o}\!\int_{\rho_{\min}}^{\rho_{\max}} \frac{|\vec{u}(s_o)|\, S(\vec{u}(s_o))\, |P(\vec{u}(s_i))|^{2}}{q}\, d^{2}s_o \;\pm\; \int_{\Phi_i}^{2\pi-\Phi_i}\!\int_{\rho_{\min}}^{\rho_{\max}} \frac{|\vec{u}(s_i)|\, S(\vec{u}(s_i))\, |P(\vec{u}(s_o))|^{2}}{q}\, d^{2}s_i, \qquad (6)

where Φ_{o/i} refers to the corresponding sectioning plane, with either the outward or inward normal, computed from the inverse cosine of Eq. (5). The general 2D vectors s_o and s_i are used for brevity to represent polar coordinates on the respective surfaces, with d²s = ρ dρ dΦ.

In summary, for each non-zero point in the image spatial frequency spectrum, the source spatiotemporal Fourier-space intensity distribution is sampled over a subset of locations representing the wavelengths and angles of incidence that may contribute incoherently to the given spatial frequency on the camera.
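The compressed sampling strategy can be sketched numerically as follows. This is an illustrative toy (not the authors' code): the optical axis is taken along z, the pupil is modeled as a binary cone mask, and, instead of the closed-form angular limits of Eq. (5), the same cone and spectral-band constraints are applied as numerical masks on a polar grid over each Ewald plane; constant grid-spacing factors of the measure d²s are omitted.

```python
import numpy as np

def otf_component(q, alpha_S, alpha_P, nu_min, nu_max, spectrum,
                  n_rho=64, n_phi=128):
    """Sample the two Ewald planes of Eq. (3) for one spatial frequency q.
    Source/pupil acceptance cones (half-angles arcsin(alpha) about +z) and
    the band [nu_min, nu_max] are applied as masks; the full method obtains
    these limits analytically via Eqs. (4) and (5)."""
    q = np.asarray(q, float)
    qn = np.linalg.norm(q)
    qhat = q / qn
    # Orthonormal basis spanning the plane orthogonal to q
    a = np.array([1.0, 0, 0]) if abs(qhat[0]) < 0.9 else np.array([0, 1.0, 0])
    e1 = np.cross(qhat, a); e1 /= np.linalg.norm(e1)
    e2 = np.cross(qhat, e1)
    rho = np.linspace(1e-6, nu_max, n_rho)
    phi = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
    R, PHI = np.meshgrid(rho, phi, indexing="ij")
    plane = (R * np.cos(PHI))[..., None] * e1 + (R * np.sin(PHI))[..., None] * e2
    T_amp = T_ph = 0.0
    for sign in (+1, -1):                    # outward / inward Ewald plane
        u = plane + sign * q / 2             # points satisfying |u| = |u - sign*q|
        un = np.linalg.norm(u, axis=-1)
        p = u - sign * q                     # paired pupil-side wave vector
        pn = np.linalg.norm(p, axis=-1)
        mask = ((un >= nu_min) & (un <= nu_max)                # spectral band
                & (u[..., 2] / un >= np.sqrt(1 - alpha_S**2))  # source cone
                & (p[..., 2] / pn >= np.sqrt(1 - alpha_P**2))) # pupil cone
        w = np.where(mask, un * spectrum(un) * R, 0.0).sum() / qn
        T_amp, T_ph = T_amp + w, T_ph + sign * w               # sum / difference
    return T_amp, T_ph

# Example: a purely transverse q with matched source and pupil NAs; the two
# Ewald planes are then mirror images, so the phase component cancels.
T_amp, T_ph = otf_component(np.array([0.5, 0.0, 0.0]), alpha_S=0.7, alpha_P=0.7,
                            nu_min=1.5, nu_max=2.0,
                            spectrum=lambda nu: np.exp(-((nu - 1.75) / 0.1) ** 2))
```

The symmetric example illustrates the well-known need for illumination asymmetry (or defocus) to obtain phase contrast: with matched cones and a transverse q⃗, the sum (amplitude) term survives while the difference (phase) term vanishes.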
As shown by the theoretical treatment presented here and in the supplementary material, this subset falls on a plane intersecting the source and pupil Fourier-space distributions, which are themselves described by cones arising from the angular selection of a finite acceptance aperture. The extent of possible contributing plane waves can therefore be determined beforehand with the analytic geometry of conic sections, treated more thoroughly in the supplementary material. Once determined, the Fourier space of the source may be sampled over the valid region at a user-determined rate that depends on the smoothness of the source distribution. This decouples computational cost from 3D image resolution for all but the bare minimum needed to produce a 3D OTF and re-couples it to the sampling rate over a smoothly varying function, which in practice greatly reduces computation time for physical sources and large datasets.

This is the main result of this letter: a numerical and analytical procedure to produce, efficiently and accurately, a non-paraxial 3D OTF for a transmission or epi-mode imaging system with an arbitrary source, regardless of bandwidth, symmetry, or state of temporal or spatial coherence. We gain this advantage by analytically establishing the limits of integration, which permits us to consider only non-zero values in the source distribution, and by centering the evaluation on a single output coordinate q⃗, again considering only non-zero outputs. Furthermore, there is no reliance on 3D Fourier transforms or convolutions, as the calculation is pared down to its bare essentials. We therefore improve upon the numerical procedure previously conceived as a direct convolutional implementation of a version of Eq. (3) for monochromatic sources only [26].

IV. RESULTS AND IMPLICATIONS
Previously, producing a 3D partially coherent OTF for arbitrary angular distributions involved discretizing a source angular spectrum, centering the computation on a single incident source direction, and convolving spherical caps representing the pupil and source distributions in 3D Fourier space.

REFERENCES
1. D. Jin, R. Zhou, Z. Yaqoob, and P. T. C. So, "Tomographic phase microscopy: Principles and applications in bioimaging [invited]," J. Opt. Soc. Am. B 34, B64 (2017). https://doi.org/10.1364/josab.34.000b64
2. P. Ledwig and F. E. Robles, "Quantitative 3D refractive index tomography of opaque samples in epi-mode," Optica 8, 6 (2021). https://doi.org/10.1364/optica.410135
3. N. Streibl, "Depth transfer by an imaging system," Opt. Acta: Int. J. Opt. 31, 1233–1241 (1984). https://doi.org/10.1080/713821435
4. H. H. Hopkins, "On the diffraction theory of optical images," Proc. R. Soc. A 217, 408–432 (1953). https://doi.org/10.1098/rspa.1953.0071
5. C. J. R. Sheppard, "Partially coherent microscope imaging system in phase space: Effect of defocus and phase reconstruction," J. Opt. Soc. Am. A 35, 1846 (2018). https://doi.org/10.1364/josaa.35.001846
6. J. Huang, Y. Bao, and T. K. Gaylord, "Three-dimensional phase optical transfer function in axially symmetric microscopic quantitative phase imaging," J. Opt. Soc. Am. A 37, 1857 (2020). https://doi.org/10.1364/josaa.403861
7. F. Zernike, "How I discovered phase contrast," Science 121, 345–349 (1955). https://doi.org/10.1126/science.121.3141.345
8. M. H. Jenkins and T. K. Gaylord, "Quantitative phase microscopy via optimized inversion of the phase optical transfer function," Appl. Opt. 54, 8566 (2015). https://doi.org/10.1364/ao.54.008566
9. D. K. Hamilton and C. J. R. Sheppard, "Differential phase contrast in scanning optical microscopy," J. Microsc. 133, 27–39 (1984). https://doi.org/10.1111/j.1365-2818.1984.tb00460.x
10. Y. Choi, P. Hosseini, J. W. Kang, S. Kang, T. D. Yang, M. G. Hyeon, B.-M. Kim, P. T. C. So, and Z. Yaqoob, "Reflection phase microscopy using spatio-temporal coherence of light," Optica 5, 1468 (2018). https://doi.org/10.1364/optica.5.001468
11. H. Nguyen, M. E. Kandel, M. Rubessa, M. B. Wheeler, and G. Popescu, "Gradient light interference microscopy for 3D imaging of unlabeled specimens," Nat. Commun. 8, 210 (2017). https://doi.org/10.1038/s41467-017-00190-7
12. A. Dubois, L. Vabre, A.-C. Boccara, and E. Beaurepaire, "High-resolution full-field optical coherence tomography with a Linnik microscope," Appl. Opt. 41, 805–812 (2002). https://doi.org/10.1364/ao.41.000805
13. T. Kim, R. Zhou, M. Mir, S. D. Babacan, P. S. Carney, L. L. Goddard, and G. Popescu, "White-light diffraction tomography of unlabelled live cells," Nat. Photonics 8, 256–263 (2014). https://doi.org/10.1038/nphoton.2013.350
14. M. R. Arnison, K. G. Larkin, C. J. R. Sheppard, N. I. Smith, and C. J. Cogswell, "Linear phase imaging using differential interference contrast microscopy," J. Microsc. 214, 7–12 (2004). https://doi.org/10.1111/j.0022-2720.2004.01293.x
15. L. Tian and L. Waller, "3D intensity and phase imaging from light field measurements in an LED array microscope," Optica 2, 104 (2015). https://doi.org/10.1364/optica.2.000104
16. P. Ledwig and F. E. Robles, "Epi-mode tomographic quantitative phase imaging in thick scattering samples," Biomed. Opt. Express 10, 3605 (2019). https://doi.org/10.1364/boe.10.003605
17. E. Wolf, "A macroscopic theory of interference and diffraction of light from finite sources II. Fields with a spectral range of arbitrary width," Proc. R. Soc. A 230, 246–265 (1955). https://doi.org/10.1098/rspa.1955.0127
18. L. Mandel and E. Wolf, Optical Coherence and Quantum Optics (Cambridge University Press, 1995), Chap. 4. The notion is treated broadly, esp. in Secs. 4.4 and 4.6, and is succinctly summarized in Sec. 4.3.1, p. 169: "However, only in very simple cases can one sharply distinguish between temporal and spatial coherence. In general these two types of coherence phenomena are not independent of each other, since … the dependence of the mutual coherence function Γ(r1, r2, τ) on the position variables r1 and r2 and on the temporal variable τ is coupled."
19. A. Köhler, "New method of illumination for photomicrographical purposes," J. R. Microsc. Soc. 14, 261–262 (1894).
20. A. C. Schell, "The multiple plate antenna," Ph.D. thesis, Massachusetts Institute of Technology, 1961.
21. S. L. Jacques, "Optical properties of biological tissues: A review," Phys. Med. Biol. 58, R37 (2013). https://doi.org/10.1088/0031-9155/58/11/r37
22. C. J. R. Sheppard and X. Q. Mao, "Three-dimensional imaging in a microscope," J. Opt. Soc. Am. A 6, 1260 (1989). https://doi.org/10.1364/josaa.6.001260
23. C. J. R. Sheppard and T. Wilson, "Image formation in scanning microscopes with partially coherent source and detector," Opt. Acta: Int. J. Opt. 25, 315–325 (1978). https://doi.org/10.1080/713819784
24. S. B. Mehta and C. J. R. Sheppard, "Partially coherent microscope in phase space," J. Opt. Soc. Am. A 35, 1272 (2018). https://doi.org/10.1364/josaa.35.001272
25. M. H. Jenkins and T. K. Gaylord, "Three-dimensional quantitative phase imaging via tomographic deconvolution phase microscopy," Appl. Opt. 54, 9213 (2015). https://doi.org/10.1364/ao.54.009213
26. J. Li, Q. Chen, J. Sun, J. Zhang, J. Ding, and C. Zuo, "Three-dimensional tomographic microscopy technique with multi-frequency combination with partially coherent illuminations," Biomed. Opt. Express 9, 2526–2542 (2018). https://doi.org/10.1364/BOE.9.002526