What if computers could process information at the speed of light, consuming a fraction of today’s energy and smashing through limits that hold back even the world’s fastest supercomputers? A groundbreaking experiment using ultrathin glass fibers and laser pulses now brings that future one step closer—and it’s not science fiction.
Key Points at a Glance
- Researchers have used laser pulses in optical fibers to mimic how artificial intelligence processes data—thousands of times faster than electronics.
- The experiment, inspired by neural networks, harnesses nonlinear interactions of light and glass, not just conventional silicon chips.
- This “Extreme Learning Machine” reached 91% accuracy in classifying handwritten digits—in less than one trillionth of a second.
- Optical AI could revolutionize energy efficiency and processing speed in next-generation computers and real-world applications.
Every year, the appetite for faster, more powerful computers grows. Artificial intelligence, powering everything from language models to self-driving cars, demands ever more data and energy. But traditional electronics are hitting a wall—too much heat, too much power, and physical limits on how quickly electrons can move. What if, instead of pushing electrons, we could use light itself?
That’s exactly what an international team of researchers, led by Dr. Mathilde Hary from Tampere University and Dr. Andrei Ermolaev from Université Marie et Louis Pasteur, has just demonstrated. In a feat of experimental physics and applied AI, they have shown that optical fibers—those hair-thin glass strands that already move information across the internet—can be turned into “brains” for ultrafast artificial intelligence. Their work, inspired by the architecture of neural networks, is helping to usher in an era of hybrid optical-electronic computers that learn and decide at the speed of light.
The secret lies in how light behaves inside the glass. The team used femtosecond laser pulses, bursts of light lasting only quadrillionths of a second, billions of times shorter than a camera flash. These pulses, filled with a rainbow of wavelengths, were sent into optical fibers with carefully timed delays, each delay encoding a piece of information, like the pixels of a handwritten number. As the pulses interacted with the glass, their colors and intensities changed in complex, nonlinear ways, amplifying tiny differences and producing new spectral patterns at the output.
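To make the idea concrete, here is a minimal numerical sketch, not the authors' actual setup: pixel values are imprinted on the intensity profile of a short pulse, a simple Kerr-type nonlinearity (self-phase modulation) mixes them, and the output spectrum becomes a feature pattern. The grid size and the nonlinear strength `gamma_L` are illustrative assumptions.

```python
import numpy as np

# Toy sketch: imprint 64 "pixel" values onto one short pulse, let a Kerr-type
# nonlinearity (self-phase modulation) mix them, and read out the spectrum.
rng = np.random.default_rng(0)
pixels = rng.random(64)                      # stand-in for an 8x8 handwritten digit

n = 1024
t = np.linspace(-5, 5, n)                    # normalized time axis
field = np.exp(-t**2)                        # smooth input pulse envelope

# Each pixel modulates one time slot of the pulse (the "delays" in the text).
for value, idx in zip(pixels, np.array_split(np.arange(n), 64)):
    field[idx] *= 0.5 + value

# Self-phase modulation: an intensity-dependent phase shift; gamma_L is an
# arbitrary nonlinearity-times-length product chosen purely for illustration.
gamma_L = 10.0
field_out = field * np.exp(1j * gamma_L * np.abs(field) ** 2)

# The output spectrum mixes all input values nonlinearly; this is the kind of
# pattern a readout layer could learn to classify.
spectrum = np.abs(np.fft.fftshift(np.fft.fft(field_out))) ** 2
print(spectrum[::16].round(2))               # a coarse 64-point feature vector
```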
Here’s where things get revolutionary: these patterns, shaped by the interplay of fiber length, pulse power, and light dispersion, actually contain enough information to allow a machine to recognize and classify digits, much like the neural networks behind modern handwriting recognition. The result? An “Extreme Learning Machine” operating with blazing speed and astonishing efficiency, able to classify images from the classic MNIST dataset with over 91% accuracy in under one picosecond. That is far less time than a single clock cycle of a conventional processor.
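Conceptually, the fiber plays the role of the fixed, untrained layer of an Extreme Learning Machine; only a linear readout is trained on the output patterns. The sketch below shows that idea with a purely numerical stand-in: a random projection plus a tanh takes the place of the optical transform, and scikit-learn's small 8x8 digits set stands in for full MNIST. The accuracy it prints describes this toy model, not the experiment.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Minimal numerical Extreme Learning Machine. In the experiment the fixed,
# untrained nonlinear mapping is performed by light in the fiber; here a
# random projection plus tanh stands in for it, and only the linear readout
# is trained, in closed form, via ridge-regularized least squares.
X, y = load_digits(return_X_y=True)          # scikit-learn's small 8x8 digits
X = X / X.max()
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
n_hidden = 2000
W = rng.normal(size=(X.shape[1], n_hidden))  # fixed "physical" layer, never trained
H_train = np.tanh(X_train @ W)
H_test = np.tanh(X_test @ W)

targets = np.eye(10)[y_train]                # one-hot labels
lam = 1e-3                                   # small ridge term for stability
beta = np.linalg.solve(H_train.T @ H_train + lam * np.eye(n_hidden),
                       H_train.T @ targets)

pred = (H_test @ beta).argmax(axis=1)
print("toy-model test accuracy:", round((pred == y_test).mean(), 3))
```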
But speed wasn’t the only breakthrough. The research revealed that the optimal performance didn’t come from simply cranking up the power or making the system more complex. Instead, it was about striking a balance: tuning the fiber’s length, adjusting how different wavelengths spread out (dispersion), and precisely controlling the light’s structure at the input. This discovery challenges the usual “more is better” mindset of digital computing and hints at the subtlety and potential of photonic systems.
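A rough way to see that balancing act numerically is a split-step simulation of pulse propagation, in which dispersion and the Kerr nonlinearity act in alternating steps along the fiber. The fiber lengths, dispersion coefficient, and nonlinearity below are illustrative guesses, not the values used in the study.

```python
import numpy as np

def propagate(field, length, beta2, gamma, dt, n_steps=200):
    """Split-step Fourier sketch: dispersion acts in the frequency domain,
    the Kerr nonlinearity in the time domain, alternating along the fiber."""
    omega = 2 * np.pi * np.fft.fftfreq(field.size, dt)
    dz = length / n_steps
    disp = np.exp(0.5j * beta2 * omega**2 * dz)          # dispersion per step
    for _ in range(n_steps):
        field = np.fft.ifft(np.fft.fft(field) * disp)
        field = field * np.exp(1j * gamma * np.abs(field)**2 * dz)
    return field

dt = 1e-14                                               # 10 fs grid (assumed)
t = (np.arange(2048) - 1024) * dt
pulse = 30.0 * np.exp(-(t / 100e-15) ** 2) + 0j          # ~100 fs pulse, ~900 W peak

# Sweep the fiber length and watch the output spectrum reshape as dispersion
# and nonlinearity interact; in the experiment, classification accuracy peaked
# at an intermediate balance of such parameters rather than at the extremes.
freqs = np.fft.fftshift(np.fft.fftfreq(t.size, dt))
for L in (0.01, 0.1, 1.0):                               # metres, illustrative
    out = propagate(pulse, L, beta2=-2e-26, gamma=0.01, dt=dt)
    spec = np.abs(np.fft.fftshift(np.fft.fft(out))) ** 2
    mean = np.sum(freqs * spec) / np.sum(spec)
    rms_bw = np.sqrt(np.sum((freqs - mean) ** 2 * spec) / np.sum(spec))
    print(f"L = {L:4.2f} m  ->  RMS spectral width: {rms_bw:.3e} Hz")
```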
Why does this matter? As AI models grow larger and more power-hungry, the world is urgently searching for new ways to process information without breaking the bank—or the planet’s energy budget. Optical AI systems promise to slash energy use while delivering speeds electronics can never match. The work by Hary, Ermolaev, and their teams points the way toward on-chip photonic computers that could one day run everything from real-time signal processing and environmental sensors to medical diagnostics and ultrafast decision-making in robotics.
What’s more, their models show how factors like quantum noise—tiny, random fluctuations at the quantum level—can influence performance, offering a roadmap for designing the next wave of hybrid optical-electronic AI. By combining the raw power of light with the versatility of neural network algorithms, researchers are unlocking new possibilities for information processing that go beyond what silicon alone can offer.
Their work, funded by the Research Council of Finland, the French National Research Agency, and the European Research Council, is a testament to the power of collaboration between physics and computer science. It’s also a reminder that some of the most transformative technologies come not from brute force, but from understanding—and harnessing—the delicate interplay of nature’s forces.
The future these researchers envision is one where computers process information at light speed, consume vastly less energy, and can learn in real time. The days of waiting for complex AI models to crunch through data could soon be over. Instead, we may soon see ultrafast, energy-efficient AI—powered not by silicon chips, but by the magic of lasers and light.
Source: Tampere University