Photonic computing, the future asset of AI

Among the drawbacks of generative artificial intelligence (AI) systems, their high power consumption is worrisome. A December 2024 report from the Lawrence Berkeley National Laboratory estimates that the development of these technologies could raise data centers' share of US electricity consumption to between 6.7% and 12% by 2028, up from 4.4% in 2023. That is roughly 300 to 600 terawatt-hours. Between 2014 and 2019, this share had barely changed, remaining below 2%.
The reason is the enormous size of these programs and the billions of operations that must be performed on electronic chips and graphics cards. While efforts are being made to make these calculations more efficient, researchers hope for far greater gains by radically changing how they are performed.
Two competing companies showcase their promising results in this area in the April 9 issue of Nature. Goodbye electrons, make way for photons, grains of light. And hello to speed gains and lower power consumption, because photons travel fast and without heating the material they pass through. All this on chips compatible with the current semiconductor industry and similar to those used in electronics.
Le Monde