What Is Next for LLMs? Next-Generation AI Computing Hardware Using Photonic Chips

Published: 06 Oct 2025 · Last Modified: 05 Feb 2026 · Nanophotonics · CC BY-NC-ND 4.0
Abstract: Large language models (LLMs) are rapidly pushing the limits of contemporary computing hardware. For example, training GPT-3 has been estimated to consume around 1300 MWh of electricity, and projections suggest future models may require city-scale (gigawatt) power budgets. These demands motivate exploration of computing paradigms beyond conventional von Neumann architectures. This review surveys emerging photonic hardware optimized for next-generation generative AI computing. We discuss integrated photonic neural network architectures (e.g., Mach–Zehnder interferometer meshes, lasers, wavelength-multiplexed microring resonators) that perform ultrafast matrix operations. We also examine promising alternative neuromorphic devices, including spiking neural network circuits and hybrid spintronic-photonic synapses, which combine memory and processing. The integration of two-dimensional materials (graphene, TMDCs) into silicon photonic platforms is reviewed for tunable modulators and on-chip synaptic elements. Transformer-based LLM architectures (self-attention and feed-forward layers) are analyzed in this context, identifying strategies and challenges for mapping dynamic matrix multiplications onto these novel hardware substrates. We then dissect the mechanisms of mainstream LLMs, such as ChatGPT, DeepSeek, and Llama, highlighting their architectural similarities and differences. We synthesize state-of-the-art components, algorithms, and integration methods, highlighting key advances and open issues in scaling such systems to mega-sized LLMs. We find that photonic computing systems could surpass electronic processors by orders of magnitude in throughput and energy efficiency, but require breakthroughs in memory, especially for long context windows and long token sequences, and in the storage of ultra-large datasets. This survey provides a comprehensive roadmap for AI hardware development, emphasizing the role of cutting-edge photonic components and technologies in supporting future LLMs.
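To make the mapping concrete, the sketch below models, under stated assumptions, how a transformer's fixed projection weights could be offloaded to a photonic matrix multiplier: the weight matrix is factored by SVD into two unitaries and a diagonal, the standard way an MZI-mesh accelerator realizes an arbitrary matrix. The function name `photonic_matmul` and the dimensions are hypothetical illustrations, not the paper's implementation; the data-dependent Q·Kᵀ product and softmax are left in ordinary arithmetic, reflecting the "dynamic matrix multiplication" challenge the abstract highlights.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_tokens = 64, 8

def photonic_matmul(W, x):
    """Hypothetical model of an MZI-mesh matrix product.

    W = U @ diag(s) @ Vh, where U and Vh would be realized as interferometer
    meshes and diag(s) as per-channel attenuators/amplifiers. Numerically this
    is identical to W @ x; on hardware each factor is an optical stage.
    """
    U, s, Vh = np.linalg.svd(W)
    return U @ (s[:, None] * (Vh @ x))

# Toy single-head self-attention: static projections use the photonic product,
# while the token-dependent attention matrix stays in the electronic domain.
X = rng.standard_normal((d_model, n_tokens))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) for _ in range(3))

Q, K, V = (photonic_matmul(W, X) for W in (Wq, Wk, Wv))
scores = (Q.T @ K) / np.sqrt(d_model)                 # token-token logits
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)              # softmax over keys
out = V @ attn.T                                      # attention-weighted values

# Sanity check: the SVD-factored "photonic" product matches a direct matmul.
print(np.allclose(photonic_matmul(Wq, X), Wq @ X))    # True
```

The point of the factorization is that U and Vh are unitary and therefore map directly onto lossless interferometer meshes; only the diagonal stage needs gain or attenuation. The softmax and the Q·Kᵀ product, by contrast, change with every token sequence, which is why fast reconfiguration or hybrid electronic-photonic scheduling remains an open issue discussed in the review.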