Revolutionizing AI: How Light-Based Chips Could Transform Energy Consumption

Innovative light-based chips promise to curb AI's soaring energy demands, addressing urgent power consumption concerns
June 17, 2024

As the relentless march of artificial intelligence continues, its hunger for energy grows at an unsustainable pace. The International Energy Agency forecasts that by 2026 AI will consume ten times more power than it did in 2023, with global data centers using as much energy as Japan. Nick Harris, founder and CEO of Lightmatter, underscores the urgency, noting that AI's computing demands are doubling every three months, far outpacing Moore's Law. This exponential growth threatens to strain both companies and economies.
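To put that doubling rate in perspective, the short calculation below compares it with a Moore's-Law-style cadence of roughly one doubling every two years. The two-year figure and the starting value are our own assumptions for illustration; only the doubling periods come from the article.

```python
# Illustrative growth comparison: compute demand doubling every 3 months
# versus a Moore's-Law-style doubling every ~24 months (assumed).
# The starting quantity is arbitrary; only the ratios matter.

ai_doubling_months = 3       # AI compute demand (per the article)
moore_doubling_months = 24   # rough Moore's Law cadence (assumption)

def growth(doubling_months: float, horizon_months: int) -> float:
    """Factor by which a quantity grows over the given horizon."""
    return 2 ** (horizon_months / doubling_months)

for years in (1, 2, 3):
    months = 12 * years
    print(f"After {years} yr: AI demand x{growth(ai_doubling_months, months):.0f}, "
          f"Moore's Law x{growth(moore_doubling_months, months):.1f}")
```

Under these assumptions, demand grows sixteenfold in a single year while transistor density grows by less than half, which is why hardware efficiency gains alone cannot keep up.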

One promising solution lies in the realm of optical computing, where information is processed using photons—tiny packets of light—rather than traditional electrons. This shift could potentially address the skyrocketing energy demands of AI.

Natalia Berloff, a physicist at the University of Cambridge, highlights that optical computing is paving the way for breakthroughs in fields requiring high-speed and high-efficiency processing, particularly artificial intelligence.

The Promise of Photonic Computing

Optical signals offer significant advantages over electrical ones. They can carry more information, operate at higher frequencies, and perform more computing steps in less time with lower latency. Moreover, optical systems could run cooler and more efficiently, allowing more operations to take place simultaneously. Gordon Wetzstein, an electrical engineer at Stanford University, suggests that harnessing these benefits could unlock new possibilities.

Researchers have explored using light for AI since the 1980s, when they built some of the earliest optical neural networks. Despite initial challenges, recent advancements have reignited interest in optical computing, particularly for matrix multiplication, the workhorse operation of AI computations.
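For readers unfamiliar with why matrix multiplication matters, the brief sketch below shows the operation at the heart of a neural-network layer, the same multiply-accumulate workload that optical hardware aims to take over. The layer sizes here are placeholder values, not tied to any particular model.

```python
import numpy as np

# A dense layer's forward pass is a matrix-vector product followed by a
# nonlinearity; the dimensions below are arbitrary placeholders.
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 784))   # weight matrix (outputs x inputs)
x = rng.standard_normal(784)          # input activations

y = np.maximum(W @ x, 0.0)            # multiply-accumulate, then ReLU
print(y.shape)                        # (256,)
```

Every layer of a large model repeats this pattern billions of times, which is why accelerating the multiply-accumulate step dominates both the speed and the energy budget of AI workloads.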

Advances in Optical Neural Networks

In 2017, a groundbreaking paper by MIT researchers detailed an optical neural network (ONN) capable of performing matrix multiplication using light. By encoding quantities into light beams and manipulating their phases, the system could efficiently process information. This innovation catalyzed renewed interest in ONNs, demonstrating their potential for AI applications.
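As a rough intuition for that scheme, the toy model below encodes an input vector as optical field amplitudes and phases, applies a complex transmission matrix standing in for the programmed optical elements, and reads out detector intensities. It is an idealized numerical sketch under simplifying assumptions, not the MIT device's actual interferometer layout.

```python
import numpy as np

rng = np.random.default_rng(1)

# Encode a real-valued input vector as optical field amplitudes.
# A negative value corresponds to a pi phase shift (field = |x| * e^{i*pi}).
x = np.array([0.8, -0.3, 0.5, -0.9])
fields_in = x.astype(complex)

# A programmed linear optical circuit acts as a complex transmission matrix T;
# light propagating through it applies T to the input fields in a single pass.
T = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
fields_out = T @ fields_in

# Simple photodetectors measure only intensity |field|^2; recovering the full
# complex result would require interferometric (coherent) detection, which we
# idealize here by also printing the fields themselves for comparison.
intensities = np.abs(fields_out) ** 2

print("ideal linear output :", fields_out)
print("detected intensities:", intensities)
```

The appeal is that the matrix-vector product happens "for free" as the light travels through the circuit, rather than through millions of sequential transistor switching events.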

Since then, progress has continued. MIT's latest ONN, HITOP, scales up computation by multiplexing across time, space, and wavelength, spreading the fixed energy cost of the hardware over many parallel calculations. While it still lags behind leading electronic chips in raw processing power, HITOP's efficiency shows promise.
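The energy argument behind that multiplexing can be seen with some back-of-the-envelope arithmetic: a fixed per-shot cost shared across more parallel operations means less energy per operation. The numbers below are purely illustrative assumptions, not HITOP's measured figures.

```python
# Generic amortization arithmetic (illustrative numbers only): a fixed
# per-shot energy cost E_fixed (e.g. driving lasers and modulators) is shared
# across all operations multiplexed over wavelength, time, and space channels.

E_fixed_pj = 100.0   # fixed energy per shot, picojoules (assumed)
E_per_op_pj = 0.1    # marginal energy per operation, picojoules (assumed)

for n_wavelengths, n_time, n_space in [(1, 1, 1), (8, 16, 4), (64, 128, 16)]:
    ops = n_wavelengths * n_time * n_space
    energy_per_op = E_fixed_pj / ops + E_per_op_pj
    print(f"{ops:>7} ops per shot -> {energy_per_op:.3f} pJ per operation")
```

The more channels a design can pack into each shot of light, the closer the energy per operation falls toward the small marginal cost.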

Other research teams are exploring more flexible optical computing systems. The University of Pennsylvania recently introduced a reconfigurable ONN that uses laser light to rewrite its computation pattern on the fly, a significant departure from the fixed circuitry of most photonic chips.

The Road Ahead

Despite significant advancements, optical computing is still in its early stages compared to electronic systems. Current photonic systems handle smaller models and workloads, and scaling these systems to compete with giants like Nvidia remains a challenge. Bhavin Shastri of Queen’s University notes that comparing photonic and electronic systems directly is complex due to differences in energy usage and operational efficiencies.

However, specialized applications for ONNs are emerging. Shastri's team developed an ONN that sorts wireless transmissions in real time, outperforming electronic systems in both speed and power efficiency. Such niche applications could pave the way for broader adoption of optical computing.

The ultimate goal remains ambitious: developing an optical neural network that surpasses electronic systems for general AI use. Peter McMahon from Cornell University envisions a future where large-scale optical systems make AI models significantly more efficient. Though this vision may be a decade away, the potential benefits are immense.

As the field progresses, the integration of light-based chips could dramatically reduce the energy footprint of AI, making it more sustainable and efficient. The journey toward optical computing represents a bold step toward meeting the ever-growing demands of artificial intelligence while addressing critical energy challenges.
