Browsing by Author "Wu, Yicheng"

Now showing 1 - 9 of 9
  • 3D sensing by optics and algorithm co-design
    (2021-04-29) Wu, Yicheng; Veeraraghavan, Ashok
    3D sensing provides the full spatial context of the world, which is important for applications such as augmented reality, virtual reality, and autonomous driving. Unfortunately, conventional cameras capture only a 2D projection of a 3D scene, and depth information is lost. In my research, I propose 3D sensors built by jointly designing optics and algorithms. The key idea is to optically encode depth in the sensor measurement and digitally decode it with computational solvers, which allows depth to be recovered accurately and robustly. In the first part of my thesis, I explore depth estimation using wavefront sensing, which is useful for scientific systems. Depth is encoded in the phase of a wavefront. I build a novel wavefront imaging sensor with high resolution (a.k.a. WISH), using a programmable spatial light modulator (SLM) and a phase retrieval algorithm. WISH offers fine phase estimation with significantly better spatial resolution than currently available wavefront sensors. However, WISH provides only a micron-scale depth range, limited by the optical wavelength. To handle macroscopic objects, I propose WISHED, which increases the depth range by more than 1,000x. This is achieved through wavelength diversity: combining the phases estimated at two closely spaced optical wavelengths. WISHED is capable of measuring transparent, translucent, and opaque 3D objects with smooth and rough surfaces. In the second part of my thesis, I study depth recovery with 3D point spread function (PSF) engineering, which has wide applications in commercial devices. Depth is encoded in the blurriness of the image. To increase the PSF variation over depth, I propose inserting a phase mask at the lens aperture. A deep learning-based algorithm is then used to predict depth from the sensor image. To optimize the entire system, I develop an end-to-end optimization pipeline. The key insight is to incorporate the learning of hardware parameters by building a differentiable physics simulator that maps the scene to a sensor image. This simulator forms the optical layer of the deep neural network, followed by digital layers that represent the computational algorithm. The network is trained on datasets with a task-specific loss and outputs optimal parameters for both the hardware and the algorithm. Based on this idea, I develop two prototypes: PhaseCam3D, a passive single-view depth sensor, and FreeCam3D, a structured-light framework for scene depth estimation and localization with freely moving cameras. In summary, this thesis provides two 3D-sensing solutions based on optical/digital co-design. I envision different modalities of 3D imaging being widely adopted in the near future, enabling improved capabilities in many existing applications while revealing entirely new, hitherto unexplored application areas.
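    The end-to-end co-design idea above can be illustrated with a short, hypothetical sketch: a learnable pupil phase mask defines a differentiable optical layer that renders the sensor image, a small network decodes depth, and one task loss updates both. The grid sizes, defocus model, decoder, and toy data below are illustrative assumptions, not the thesis's actual PhaseCam3D pipeline.

        # Minimal end-to-end optics/algorithm co-design sketch (assumes PyTorch).
        # The pupil phase mask is a learnable parameter; the optical layer computes a
        # depth-dependent PSF from it via Fourier optics and convolves the scene; a
        # small CNN predicts depth; a single loss trains both hardware and algorithm.
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        N = 63  # pupil/PSF grid size (odd, so "same" convolution is simple) -- assumption

        class OpticalLayer(nn.Module):
            def __init__(self):
                super().__init__()
                self.mask_phase = nn.Parameter(torch.zeros(N, N))  # learnable phase mask

            def psf(self, defocus):
                # Pupil = circular aperture * exp(i*(mask + defocus*r^2)); PSF = |FFT(pupil)|^2.
                y, x = torch.meshgrid(torch.linspace(-1, 1, N), torch.linspace(-1, 1, N), indexing="ij")
                r2 = x ** 2 + y ** 2
                aperture = (r2 <= 1.0).float()
                pupil = aperture * torch.exp(1j * (self.mask_phase + defocus * r2))
                h = torch.fft.fftshift(torch.fft.fft2(pupil)).abs() ** 2
                return h / h.sum()

            def forward(self, scene, defocus):
                # Incoherent image formation: convolve the scene with the depth-dependent PSF.
                kernel = self.psf(defocus)[None, None]
                return F.conv2d(scene, kernel, padding=N // 2)

        optics = OpticalLayer()
        decoder = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                                nn.Conv2d(16, 1, 3, padding=1))
        opt = torch.optim.Adam(list(optics.parameters()) + list(decoder.parameters()), lr=1e-3)

        scene = torch.rand(1, 1, 128, 128)        # toy scene (stand-in for a training dataset)
        depth_gt = torch.full_like(scene, 0.3)    # toy ground-truth depth map

        for _ in range(10):                       # toy training loop
            measurement = optics(scene, defocus=2.0)
            loss = F.l1_loss(decoder(measurement), depth_gt)
            opt.zero_grad()
            loss.backward()
            opt.step()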
  • Deep learning extended depth-of-field microscope for fast and slide-free histology
    (PNAS, 2020) Jin, Lingbo; Tang, Yubo; Wu, Yicheng; Coole, Jackson B.; Tan, Melody T.; Zhao, Xuan; Badaoui, Hawraa; Robinson, Jacob T.; Williams, Michelle D.; Gillenwater, Ann M.; Richards-Kortum, Rebecca R.; Veeraraghavan, Ashok; Bioengineering; Electrical and Computer Engineering
    Microscopic evaluation of resected tissue plays a central role in the surgical management of cancer. Because optical microscopes have a limited depth-of-field (DOF), resected tissue is either frozen or preserved with chemical fixatives, sliced into thin sections placed on microscope slides, stained, and imaged to determine whether surgical margins are free of tumor cells—a costly and time- and labor-intensive procedure. Here, we introduce a deep-learning extended DOF (DeepDOF) microscope to quickly image large areas of freshly resected tissue to provide histologic-quality images of surgical margins without physical sectioning. The DeepDOF microscope consists of a conventional fluorescence microscope with the simple addition of an inexpensive (less than $10) phase mask inserted in the pupil plane to encode the light field and enhance the depth-invariance of the point-spread function. When used with a jointly optimized image-reconstruction algorithm, diffraction-limited optical performance to resolve subcellular features can be maintained while significantly extending the DOF (200 µm). Data from resected oral surgical specimens show that the DeepDOF microscope can consistently visualize nuclear morphology and other important diagnostic features across highly irregular resected tissue surfaces without serial refocusing. With the capability to quickly scan intact samples with subcellular detail, the DeepDOF microscope can improve tissue sampling during intraoperative tumor-margin assessment, while offering an affordable tool to provide histological information from resected tissue specimens in resource-limited settings.
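    The role of the pupil-plane phase mask follows from the standard Fourier-optics relation that an incoherent system's PSF is the squared magnitude of the Fourier transform of its pupil function, so reshaping the pupil phase reshapes how defocus blurs the image. The toy sketch below uses a cubic phase profile purely as a classic extended-DOF example; DeepDOF's actual mask and its jointly optimized reconstruction network are not reproduced here.

        # Toy sketch: PSF from a pupil function containing a phase mask (NumPy only).
        # Assumptions: unit-radius circular aperture on a 256x256 grid; defocus modeled
        # as a quadratic pupil phase; a cubic profile stands in as the example mask.
        import numpy as np

        N = 256
        y, x = np.meshgrid(np.linspace(-1, 1, N), np.linspace(-1, 1, N), indexing="ij")
        aperture = ((x ** 2 + y ** 2) <= 1.0).astype(float)

        def psf(mask_phase, defocus):
            pupil = aperture * np.exp(1j * (mask_phase + defocus * (x ** 2 + y ** 2)))
            h = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))) ** 2
            return h / h.sum()

        cubic_mask = 20.0 * (x ** 3 + y ** 3)     # example phase profile, not DeepDOF's learned mask
        # Compare how much the PSF varies with defocus, with vs. without the mask.
        for d in (0.0, 5.0, 10.0):
            with_mask = np.sum((psf(cubic_mask, d) - psf(cubic_mask, 0.0)) ** 2)
            no_mask = np.sum((psf(0.0, d) - psf(0.0, 0.0)) ** 2)
            print(f"defocus={d}: PSF change with mask {with_mask:.2e}, without {no_mask:.2e}")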
  • Passive and single-viewpoint 3D imaging system
    (2023-06-13) Wu, Yicheng; Boominathan, Vivek; Chen, Huaijin; Sankaranarayanan, Aswin C.; Veeraraghavan, Ashok; William Marsh Rice University; Carnegie Mellon University; United States Patent and Trademark Office
    A method for a passive single-viewpoint 3D imaging system comprises capturing an image from a camera having one or more phase masks. The method further includes using a reconstruction algorithm for estimation of a 3D or depth image.
  • Passive and single-viewpoint 3D imaging system
    (2024-08-27) Wu, Yicheng; Boominathan, Vivek; Chen, Huaijin; Sankaranarayanan, Aswin C.; Veeraraghavan, Ashok; Rice University; United States Patent and Trademark Office
    A method for a passive single-viewpoint 3D imaging system comprises capturing an image from a camera having one or more phase masks. The method further includes using a reconstruction algorithm for estimation of a 3D or depth image.
  • Phase Retrieval Methods to Improve Spatial Resolution of Long-Distance Imaging
    (2018-10-17) Wu, Yicheng; Veeraraghavan, Ashok
    For long-distance imaging, spatial resolution is limited mainly by the small aperture size of the lens. To improve performance, large, high-precision lens systems are normally built, which are bulky and expensive. In this thesis, I propose two methods that achieve high spatial resolution with lightweight, low-cost lenses by means of phase retrieval algorithms. In the first work, I present a reflective Fourier Ptychography system using a small lens. Multiple low-resolution images are captured from different locations and then fused computationally to create a large synthetic aperture. I demonstrate the first working prototype, which achieves a sixfold spatial-resolution improvement over any single captured image for various diffuse objects. In the second work, I propose correcting arbitrary optical aberrations in a large but low-cost, low-quality lens by wavefront coding. A spatial light modulator (SLM) is placed in front of the sensor to randomly modulate the phase distribution of the incident light field. From multiple SLM patterns and the corresponding sensor images, the object field can be retrieved. I build an experimental system showing that a 2-inch Fresnel lens can produce diffraction-limited images.
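    For context, the aperture-limited resolution this abstract refers to is the standard diffraction limit (Rayleigh criterion); this is textbook optics rather than a result of the thesis, and it is why synthesizing a larger effective aperture improves resolution.

        % Rayleigh criterion for a circular aperture of diameter D at wavelength \lambda,
        % imaging at distance z (standard optics, stated here for context only).
        \[
          \theta_{\min} \approx 1.22\,\frac{\lambda}{D},
          \qquad
          \delta_{\min} \approx 1.22\,\frac{\lambda z}{D}.
        \]
        % Enlarging the effective (synthetic) aperture D is what lifts this ceiling.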
  • SAVI: Synthetic apertures for long-range, subdiffraction-limited visible imaging using Fourier ptychography
    (AAAS, 2017) Holloway, Jason; Wu, Yicheng; Sharma, Manoj K.; Cossairt, Oliver; Veeraraghavan, Ashok
    Synthetic aperture radar is a well-known technique for improving resolution in radio imaging. Extending these synthetic aperture techniques to the visible light domain is not straightforward because optical receivers cannot measure phase information. We propose to use macroscopic Fourier ptychography (FP) as a practical means of creating a synthetic aperture for visible imaging to achieve subdiffraction-limited resolution. We demonstrate the first working prototype for macroscopic FP in a reflection imaging geometry that is capable of imaging optically rough objects. In addition, a novel image space denoising regularization is introduced during phase retrieval to reduce the effects of speckle and improve perceptual quality of the recovered high-resolution image. Our approach is validated experimentally where the resolution of various diffuse objects is improved sixfold.
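    The Fourier-domain stitching at the heart of macroscopic FP can be summarized as a toy alternating-projections loop: simulate intensity-only captures for several sub-aperture positions, then repeatedly enforce each measured magnitude while updating the corresponding region of a wide-aperture Fourier estimate. The grid size, aperture radius, scan positions, and iteration count below are arbitrary assumptions, and the paper's speckle-denoising regularization is omitted.

        # Toy macroscopic Fourier ptychography stitching (NumPy, self-contained).
        import numpy as np

        rng = np.random.default_rng(0)
        N = 128
        F_true = np.fft.fft2(rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))

        fy, fx = np.meshgrid(np.arange(N) - N // 2, np.arange(N) - N // 2, indexing="ij")
        def aperture(cy, cx, radius=24):
            # Circular sub-aperture support in the (centered) Fourier plane.
            return ((fy - cy) ** 2 + (fx - cx) ** 2) <= radius ** 2

        centers = [(dy, dx) for dy in (-16, 0, 16) for dx in (-16, 0, 16)]  # scan positions (assumption)
        measured = []                                   # simulated intensity-only captures
        for c in centers:
            field = np.fft.ifft2(np.fft.ifftshift(aperture(*c) * np.fft.fftshift(F_true)))
            measured.append(np.abs(field))

        F_est = np.zeros((N, N), dtype=complex)         # wide-aperture Fourier-domain estimate
        for _ in range(50):
            for c, amp in zip(centers, measured):
                A = aperture(*c)
                Fs = np.fft.fftshift(F_est)
                low = np.fft.ifft2(np.fft.ifftshift(A * Fs))   # low-res field for this sub-aperture
                low = amp * np.exp(1j * np.angle(low))         # enforce the measured magnitude
                Fs[A] = np.fft.fftshift(np.fft.fft2(low))[A]   # update the overlapping Fourier region
                F_est = np.fft.ifftshift(Fs)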
  • Synthetic apertures for long-range, sub-diffraction limited visible imaging using fourier ptychography
    (2020-06-23) Cossairt, Oliver Strider; Holloway, Jason; Veeraraghavan, Ashok; Sharma, Manoj Kumar; Wu, Yicheng; Rice University; Northwestern University; United States Patent and Trademark Office
    A method for imaging objects includes illuminating an object with a light source of an imaging device, and receiving an illumination field reflected by the object. An aperture field that intercepts a pupil of the imaging device is an optical propagation of the illumination field at an aperture plane. The method includes receiving a portion of the aperture field onto a camera sensor, and receiving a sensor field of optical intensity. The method also includes iteratively centering the camera focus along the Fourier plane at different locations to produce a series of sensor fields and stitching together the sensor fields in the Fourier domain to generate an image. The method also includes determining a plurality of phase information for each sensor field in the series of sensor fields, applying the plurality of phase information to the image, receiving a plurality of illumination fields reflected by the object, and denoising the intensity of the plurality of illumination fields using Fourier ptychography.
  • WISH: wavefront imaging sensor with high resolution
    (Springer Nature, 2019) Wu, Yicheng; Sharma, Manoj Kumar; Veeraraghavan, Ashok
    Wavefront sensing is the simultaneous measurement of the amplitude and phase of an incoming optical field. Traditional wavefront sensors such as the Shack-Hartmann wavefront sensor (SHWFS) suffer from a fundamental tradeoff between spatial resolution and phase estimation and consequently can only achieve a resolution of a few thousand pixels. To break this tradeoff, we present a novel computational-imaging-based technique, namely, the Wavefront Imaging Sensor with High resolution (WISH). We replace the microlens array in the SHWFS with a spatial light modulator (SLM) and use a computational phase-retrieval algorithm to recover the incident wavefront. This wavefront sensor can measure highly varying optical fields at more than 10-megapixel resolution with fine phase estimation. To the best of our knowledge, this resolution is an order of magnitude higher than that of current noninterferometric wavefront sensors. To demonstrate the capability of WISH, we present three applications, which cover a wide range of spatial scales. First, we produce diffraction-limited reconstructions for long-distance imaging by combining WISH with a large-aperture, low-quality Fresnel lens. Second, we show the recovery of high-resolution images of objects that are obscured by scattering. Third, we show that WISH can be used as a microscope without an objective lens. Our study suggests that the design principle of WISH, which combines optical modulators and computational algorithms to sense high-resolution optical fields, enables improved capabilities in many existing applications while revealing entirely new, hitherto unexplored application areas.
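    The recovery step described above is a multi-measurement phase-retrieval problem: each known SLM pattern modulates the unknown field, the sensor records intensity only, and the field is estimated by alternating projections averaged over all patterns. The toy sketch below models propagation to the sensor as a single FFT and uses random phase-only SLM patterns; these are simplifying assumptions, not the published WISH algorithm's exact propagation model or parameters.

        # Toy multi-pattern phase retrieval in the spirit of an SLM-based wavefront sensor.
        import numpy as np

        rng = np.random.default_rng(1)
        N, K = 64, 8                                            # field size, number of SLM patterns
        u_true = rng.standard_normal((N, N)) * np.exp(1j * rng.uniform(0, 2 * np.pi, (N, N)))
        slm = [np.exp(1j * rng.uniform(0, 2 * np.pi, (N, N))) for _ in range(K)]
        meas = [np.abs(np.fft.fft2(u_true * p)) for p in slm]   # intensity-only sensor amplitudes

        u = np.ones((N, N), dtype=complex)                      # initial guess
        for _ in range(200):
            estimates = []
            for p, amp in zip(slm, meas):
                s = np.fft.fft2(u * p)                          # propagate current guess to the sensor
                s = amp * np.exp(1j * np.angle(s))              # enforce the measured amplitude
                estimates.append(np.fft.ifft2(s) / p)           # back-propagate and undo SLM modulation
            u = np.mean(estimates, axis=0)                      # average field estimates over patterns

        # Up to a global phase, u should approximate u_true; print the normalized correlation.
        corr = np.vdot(u_true, u) / (np.linalg.norm(u_true) * np.linalg.norm(u))
        print(abs(corr))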
  • WISH: wavefront imaging sensor with high resolution
    (Springer Nature, 2019) Wu, Yicheng; Sharma, Manoj Kumar; Veeraraghavan, Ashok
    Wavefront sensing is the simultaneous measurement of the amplitude and phase of an incoming optical field. Traditional wavefront sensors such as the Shack-Hartmann wavefront sensor (SHWFS) suffer from a fundamental tradeoff between spatial resolution and phase estimation and consequently can only achieve a resolution of a few thousand pixels. To break this tradeoff, we present a novel computational-imaging-based technique, namely, the Wavefront Imaging Sensor with High resolution (WISH). We replace the microlens array in the SHWFS with a spatial light modulator (SLM) and use a computational phase-retrieval algorithm to recover the incident wavefront. This wavefront sensor can measure highly varying optical fields at more than 10-megapixel resolution with fine phase estimation. To the best of our knowledge, this resolution is an order of magnitude higher than that of current noninterferometric wavefront sensors. To demonstrate the capability of WISH, we present three applications, which cover a wide range of spatial scales. First, we produce diffraction-limited reconstructions for long-distance imaging by combining WISH with a large-aperture, low-quality Fresnel lens. Second, we show the recovery of high-resolution images of objects that are obscured by scattering. Third, we show that WISH can be used as a microscope without an objective lens. Our study suggests that the design principle of WISH, which combines optical modulators and computational algorithms to sense high-resolution optical fields, enables improved capabilities in many existing applications while revealing entirely new, hitherto unexplored application areas.