
Jan 25, 2024

Programming light propagation creates highly efficient neural networks

Posted by in categories: information science, robotics/AI, space

Current artificial intelligence models utilize billions of trainable parameters to achieve challenging tasks. However, this large number of parameters comes with a hefty cost. Training and deploying these huge models require immense memory space and computing capability that can only be provided by hangar-sized data centers in processes that consume energy equivalent to the electricity needs of midsized cities.

The AI community is presently making efforts to rethink both the underlying computing hardware and the machine learning algorithms so that the development of AI can sustainably continue at its current pace. Optical implementation of neural network architectures is a promising avenue because the connections between units can be realized with very little power.

New research reported in Advanced Photonics combines light propagation inside multimode fibers with a small number of digitally programmable parameters and achieves the same performance on image classification tasks as fully digital systems with more than 100 times as many programmable parameters. This cuts memory requirements and reduces the need for energy-intensive digital processing, while reaching the same level of accuracy across a variety of machine learning tasks.
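To make the idea concrete, here is a minimal sketch (not the authors' actual model) of such a hybrid classifier: the multimode fiber is approximated as a fixed, untrained random complex transmission matrix followed by intensity detection, and only a small digital softmax head is trained. All sizes, the random-matrix fiber model, and the logistic-regression readout are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pixels, n_modes, n_classes = 28 * 28, 256, 10

# Fixed "optical" layer: a random complex transmission matrix standing in for
# propagation through the multimode fiber. It is never trained, so it
# contributes no trainable parameters.
T = (rng.standard_normal((n_modes, n_pixels)) +
     1j * rng.standard_normal((n_modes, n_pixels))) / np.sqrt(n_pixels)

def optical_frontend(images):
    """Propagate flattened images through the fiber model; camera detects intensity."""
    fields = images @ T.T            # complex field at the fiber output
    return np.abs(fields) ** 2       # intensity detection acts as the nonlinearity

# Small digital head: the only trainable parameters (n_modes * n_classes + n_classes).
W = np.zeros((n_modes, n_classes))
b = np.zeros(n_classes)

def train_step(images, labels, lr=0.1):
    """One gradient-descent step of softmax regression on the detected intensities."""
    global W, b
    feats = optical_frontend(images)
    logits = feats @ W + b
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    grad = probs - np.eye(n_classes)[labels]        # softmax cross-entropy gradient
    W -= lr * feats.T @ grad / len(images)
    b -= lr * grad.mean(axis=0)
```

The point of the sketch is the parameter count: the heavy, high-dimensional mixing is done by the fixed optical transform for free, while only the small digital readout needs to be stored and trained.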
