Optical Neural Networks Close the Gap on Their Electronic Counterparts
A new paper in Advanced Photonics, an open-access journal co-published by SPIE, the international society for optics and photonics, and Chinese Laser Press (CLP), demonstrates distinct improvements to the inference and generalization performance of diffractive optical neural networks.
A key improvement reported in the paper, "Class-specific differential detection in diffractive optical neural networks improves inference accuracy," combines a differential detection scheme with a set of diffractive optical networks operating in parallel, where each network in the set is specialized to recognize a sub-group of the object classes.
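The idea can be illustrated in a few lines of code. The sketch below is not the authors' implementation: each trained diffractive network is stood in for by a random linear map, and names such as `n_networks` and `classes_per_net` are assumptions for this demo. What it shows is the detection logic itself: every class gets a positive and a negative detector, its score is the difference of the two signals, and each parallel network scores only its assigned subset of classes.

```python
# Illustrative sketch of class-specific differential detection with
# parallel networks (NOT the paper's code; the optical networks are
# replaced here by random linear maps for demonstration only).
import numpy as np

rng = np.random.default_rng(0)

n_classes = 10          # e.g. ten handwritten-digit classes
n_networks = 2          # number of parallel diffractive networks (assumed)
classes_per_net = n_classes // n_networks
n_pixels = 64           # flattened input intensity pattern (assumed size)

# Stand-ins for trained networks: each maps the input to one positive
# and one negative detector signal per class it is responsible for.
pos_weights = [rng.random((classes_per_net, n_pixels)) for _ in range(n_networks)]
neg_weights = [rng.random((classes_per_net, n_pixels)) for _ in range(n_networks)]

def differential_inference(x):
    """Score each class as (positive detector) - (negative detector),
    merging the class subsets handled by the parallel networks, then
    pick the class with the highest differential signal."""
    scores = np.empty(n_classes)
    for k in range(n_networks):
        subset = slice(k * classes_per_net, (k + 1) * classes_per_net)
        scores[subset] = pos_weights[k] @ x - neg_weights[k] @ x
    return int(np.argmax(scores))

x = rng.random(n_pixels)            # dummy input intensity pattern
pred = differential_inference(x)
print(pred)
```

Subtracting a paired negative detector from each positive one removes common-mode bias in the detected signals, which is the intuition behind the reported accuracy gains; the parallel, class-specialized networks then let each sub-network devote its full diffractive capacity to fewer classes.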
According to SPIE Fellow Aydogan Ozcan of the University of California, Los Angeles, and one of the paper's authors, these results "provide a major advancement to bring optical neural network-based low-power and low-latency solutions for various machine-learning applications."
This latest research marks a significant advance in Ozcan's optical machine-learning framework: refining this technology is especially important for recognizing target objects more quickly and with far less power than standard computer-based machine-learning systems. Ultimately, it may offer major advantages for autonomous vehicles, robotics, and various defense-related applications, among others.
These systematic improvements, particularly in diffractive optical network design, could drive the development of next-generation, task-specific, intelligent computational camera systems.
Reference: Li, J., Mengu, D., Luo, Y., Rivenson, Y., & Ozcan, A. (2019). Class-specific differential detection in diffractive optical neural networks improves inference accuracy. Advanced Photonics, 1(4), 046001. https://doi.org/10.1117/1.AP.1.4.046001