General neural network approach to compressive feature extraction

dc.citation.firstpage: G217
dc.citation.issueNumber: 25
dc.citation.journalTitle: Applied Optics
dc.citation.lastpage: G223
dc.citation.volumeNumber: 60
dc.contributor.author: Giljum, Anthony
dc.contributor.author: Liu, Weidi
dc.contributor.author: Li, Le
dc.contributor.author: Weber, Reed
dc.contributor.author: Kelly, Kevin F.
dc.date.accessioned: 2021-08-30T19:39:38Z
dc.date.available: 2021-08-30T19:39:38Z
dc.date.issued: 2021
dc.description.abstract: Computer vision with a single-pixel camera is currently limited by a trade-off between reconstruction capability and image classification accuracy. If random projections are used to sample the scene, then reconstruction is possible but classification accuracy suffers, especially in cases with significant background signal. If data-driven projections are used, then classification accuracy improves and the effect of the background is diminished, but image recovery is not possible. Here, we employ a shallow neural network to nonlinearly convert from measurements acquired with random patterns to measurements acquired with data-driven patterns. The results demonstrate that this improves classification accuracy while still allowing for full reconstruction.
dc.identifier.citation: Giljum, Anthony, Liu, Weidi, Li, Le, et al. "General neural network approach to compressive feature extraction." Applied Optics 60, no. 25 (2021) Optical Society of America: G217-G223. https://doi.org/10.1364/AO.427383.
dc.identifier.doi: https://doi.org/10.1364/AO.427383
dc.identifier.uri: https://hdl.handle.net/1911/111344
dc.language.iso: eng
dc.publisher: Optical Society of America
dc.title: General neural network approach to compressive feature extraction
dc.type: Journal article
dc.type.dcmi: Text
dc.type.publication: publisher version
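
The abstract above describes a shallow neural network that nonlinearly maps measurements taken with random patterns to measurements that would have been taken with data-driven patterns. The record does not include code; the lines below are only a minimal illustrative sketch of that idea in PyTorch. The class name MeasurementMapper, all dimensions (128 random measurements, 32 data-driven measurements, 256 hidden units), and the placeholder data are hypothetical choices for illustration, not the authors' implementation.

import torch
import torch.nn as nn

# Assumed, illustrative dimensions (not taken from the paper):
M_RANDOM = 128       # number of random-pattern measurements per scene
K_DATA_DRIVEN = 32   # number of data-driven measurements to predict
HIDDEN = 256         # width of the single hidden layer ("shallow" network)

class MeasurementMapper(nn.Module):
    """Shallow network: random-pattern measurements -> data-driven measurements."""
    def __init__(self, m=M_RANDOM, k=K_DATA_DRIVEN, hidden=HIDDEN):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(m, hidden),
            nn.ReLU(),               # the nonlinearity makes the conversion nonlinear
            nn.Linear(hidden, k),
        )

    def forward(self, y_random):
        return self.net(y_random)

# Training sketch: in practice, pairs (y_random, y_data_driven) would be
# simulated by applying both sets of patterns to the same training images.
model = MeasurementMapper()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

y_random = torch.randn(64, M_RANDOM)             # placeholder batch of measurements
y_data_driven = torch.randn(64, K_DATA_DRIVEN)   # placeholder targets

for step in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(y_random), y_data_driven)
    loss.backward()
    optimizer.step()

# The random measurements remain available for conventional compressive-sensing
# reconstruction, while model(y_random) can feed a classifier.
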
Files
Original bundle
Name: kelly21applopt_s2c2.pdf
Size: 7.53 MB
Format: Adobe Portable Document Format