Fast acquisition of depth information is crucial for accurate 3D tracking of moving objects. Snapshot depth sensing can be achieved by wavefront coding, in which the point-spread function (PSF) is engineered to vary distinctively with scene depth by altering the detection optics. Here we employ multi-channel wavefront coding and demonstrate it for 3D localization microscopy in densely labelled live cells, achieving significantly improved results over a single-channel optical system.
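To illustrate the principle behind depth encoding via an engineered PSF, the sketch below uses a toy astigmatic model (hypothetical parameters, not the multi-channel system described above): the PSF widths along x and y depend oppositely on defocus, so the measured ellipticity determines depth.

```python
import math

# Toy astigmatic PSF model (all parameters hypothetical, for illustration):
S0 = 1.0     # in-focus PSF width (arbitrary units)
GAMMA = 0.4  # focal-plane offset of each axis (a.u.)
D = 0.5      # depth-of-focus scale (a.u.)

def widths(z):
    """PSF widths (sx, sy) at defocus z under the toy astigmatism model."""
    sx = S0 * math.sqrt(1.0 + ((z - GAMMA) / D) ** 2)
    sy = S0 * math.sqrt(1.0 + ((z + GAMMA) / D) ** 2)
    return sx, sy

def z_from_widths(sx, sy, zmin=-1.0, zmax=1.0, steps=2001):
    """Recover z by grid search: sx - sy is monotonic in z over this range."""
    best_z, best_err = zmin, float("inf")
    for i in range(steps):
        z = zmin + (zmax - zmin) * i / (steps - 1)
        mx, my = widths(z)
        err = (mx - sx) ** 2 + (my - sy) ** 2
        if err < best_err:
            best_z, best_err = z, err
    return best_z

sx, sy = widths(0.3)
print(round(z_from_widths(sx, sy), 2))  # recovers z ≈ 0.3
```

The same inversion logic underlies any depth-encoding PSF: as long as the shape varies distinctively with z, the depth can be decoded from a single snapshot.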
We implement a neural network to classify single-particle trajectories by diffusion type. Furthermore, we demonstrate the applicability of our network architecture for estimating the Hurst exponent of fractional Brownian motion and the diffusion coefficient of Brownian motion, on both simulated and experimental data. On experimental data, both the network and traditional analysis converge to similar values, with the network requiring only half the number of trajectories to achieve the same confidence interval.
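As a baseline for comparison, the traditional estimate of the Hurst exponent comes from the time-averaged mean squared displacement, MSD(lag) ~ lag^(2H). The sketch below (a minimal illustration, not the network described above) estimates H from the log-log slope of the MSD for a simulated Brownian trajectory, where H = 0.5:

```python
import math
import random

def hurst_from_msd(x, max_lag=20):
    """Estimate the Hurst exponent from the time-averaged MSD:
    MSD(lag) ~ lag^(2H), so H is half the least-squares log-log slope."""
    pts = []
    for lag in range(1, max_lag + 1):
        n = len(x) - lag
        msd = sum((x[i + lag] - x[i]) ** 2 for i in range(n)) / n
        pts.append((math.log(lag), math.log(msd)))
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    slope = (sum((a - mx) * (b - my) for a, b in pts)
             / sum((a - mx) ** 2 for a, _ in pts))
    return slope / 2.0

# Ordinary Brownian motion is fBm with H = 0.5: a cumulative sum of
# independent Gaussian steps.
random.seed(0)
traj = [0.0]
for _ in range(100000):
    traj.append(traj[-1] + random.gauss(0.0, 1.0))
print(hurst_from_msd(traj))  # close to 0.5 for Brownian motion
```

The confidence-interval comparison in the abstract is against exactly this kind of MSD-based estimator, averaged over many trajectories.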
Localization microscopy is an imaging technique in which the positions of individual point emitters are precisely determined from their images. Localization in 3D can be performed by modifying the image that a point source creates on the camera, namely, the point-spread function (PSF), using additional optical elements. Here, we present two applications of convolutional neural networks (CNNs) in dense 3D localization microscopy: learning an efficient 3D localization CNN for a given PSF entirely in silico, and learning an optimized PSF for high-density localization via end-to-end optimization.
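To make the localization task concrete, the sketch below shows the classical single-emitter baseline that CNN-based methods improve upon in dense scenes: render a pixelated Gaussian PSF at a subpixel position, then recover that position as the intensity-weighted centroid (an illustrative baseline, not the CNN approach described above).

```python
import math

def psf_image(x0, y0, size=11, sigma=1.2):
    """Render a point emitter as a noiseless, pixelated 2D Gaussian PSF."""
    return [[math.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
             for x in range(size)] for y in range(size)]

def localize_centroid(img):
    """Estimate the emitter position as the intensity-weighted centroid."""
    total = sum(sum(row) for row in img)
    x = sum(v * j for row in img for j, v in enumerate(row)) / total
    y = sum(v * i for i, row in enumerate(img) for v in row) / total
    return x, y

img = psf_image(5.3, 4.8)
x, y = localize_centroid(img)
print(round(x, 1), round(y, 1))  # ≈ 5.3 4.8
```

This works for a single isolated emitter; with noise, engineered 3D PSFs, and overlapping emitters in dense samples, simple centroiding breaks down, which is the regime the CNNs above are trained for.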