Why Data Centers Are Becoming an Energy Infrastructure Problem

When digital growth collides with physical limits
When the IT reflex stops working
For decades, data centers were treated as a purely technological concern. They were planned like specialised real estate: secure locations, robust connectivity, reliable power supply. Energy was an operational input — important, but ultimately manageable through contracts and efficiency gains.
That assumption is no longer valid. Across Europe, new data center projects are delayed, downsized or rejected altogether — not because demand is lacking, but because grid capacity is. What used to be a cost consideration has become a hard physical constraint.
“The inability to get a grid connection has transformed from a future cost concern into a present-day hard stop on growth. By 2027, an estimated 40% of AI data centers will be power-constrained. Architecture is now a governance issue.”
International Energy Agency (analysis cited by Mark Williams, Senior Research Analyst)
At this point, data centers stop being an IT topic and become something else entirely: an energy infrastructure problem.
First principles: what if energy becomes the limiting factor?
From a first-principles perspective, the shift is profound. Classical IT logic assumes that compute is scarce and expensive, while energy is abundant and relatively predictable. In that world, optimisation focuses on faster processors, denser racks and higher utilisation.
But what happens when that relationship flips?
When compute becomes abundant but energy scarce, the entire optimisation logic collapses. The limiting factor is no longer processing power, but grid capacity, thermal dissipation and predictability of load. A data center, in this context, is best understood not as a computing facility, but as an energy conversion system: electricity in, information and heat out.
This is where the Energy Wall emerges — the point at which further digital growth collides with the physical limits of the energy system.
A key driver behind this wall is the changing cost structure of digital infrastructure. In traditional IT models, capital expenditure on hardware dominated total cost of ownership. In the AI era, operational expenditure — electricity and cooling — increasingly outweighs the cost of the servers themselves. Efficiency gains at chip level are quickly absorbed by growing demand, a classic instance of the Jevons Paradox applied to digital demand.
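The tipping point can be made concrete with a back-of-the-envelope calculation. Every figure below (server price, power draw, facility overhead, electricity price) is an assumption chosen for illustration, not sourced data:

```python
# Illustrative only: how long until cumulative electricity spend
# overtakes the purchase price of the hardware?

def years_until_opex_exceeds_capex(
    capex_eur: float,         # purchase price of one AI server (assumed)
    power_kw: float,          # average draw of that server (assumed)
    pue: float,               # facility overhead multiplier (assumed)
    price_eur_per_kwh: float, # electricity price (assumed)
) -> float:
    """Years of continuous operation before energy cost equals capex."""
    annual_kwh = power_kw * pue * 24 * 365
    annual_opex_eur = annual_kwh * price_eur_per_kwh
    return capex_eur / annual_opex_eur

# Hypothetical example: a EUR 60k server drawing 10 kW,
# in a PUE-1.4 facility, at EUR 0.25/kWh.
years = years_until_opex_exceeds_capex(60_000, 10.0, 1.4, 0.25)
print(f"Crossover after {years:.1f} years of operation")
```

Under these assumed numbers the crossover arrives well within a typical server's service life, which is the structural shift the paragraph describes.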
Heat, density and the thermodynamic ceiling
Nowhere is this tension more visible than at the rack level. Modern AI racks can draw 80–100 kilowatts each; a typical household, by comparison, peaks at roughly 3 kilowatts. Concentrated into a small physical footprint, this energy density creates an unavoidable thermal problem.
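A quick arithmetic check makes the density figures tangible. The rack and household numbers are the article's own; the rest is simple division:

```python
# Back-of-the-envelope check of the rack-level density figures.
rack_kw = 100.0          # upper bound for a modern AI rack (from the text)
household_peak_kw = 3.0  # typical household peak (from the text)

households_per_rack = rack_kw / household_peak_kw
print(f"One AI rack draws roughly {households_per_rack:.0f} household peaks")

# Essentially all of that electricity leaves the rack as heat the facility
# must remove, so power density and cooling load are the same number.
print(f"Heat to remove per rack: {rack_kw:.0f} kW")
```

One rack at the top of the quoted range thus matches the simultaneous peak demand of around 33 households, concentrated into a few square metres.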
Electronic interconnects play a crucial role here. Copper links inside servers and between components act as countless microscopic heaters. As bandwidth increases, resistance losses rise, forcing ever more aggressive cooling strategies. Cooling, in turn, consumes additional energy, reinforcing a feedback loop that pushes the system closer to its thermodynamic ceiling.
This is not merely an engineering inconvenience. It directly affects spatial planning, grid connections and environmental permits. At scale, heat becomes the silent antagonist of digital growth.
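The feedback loop described above can be sketched as a toy model. The cooling coefficient and the 10% load reduction are assumptions for illustration; real chiller behaviour is non-linear and climate-dependent:

```python
# Toy model: cooling energy is assumed proportional to the heat it
# must remove (a fixed coefficient; real systems are more complex).

def facility_power_kw(it_power_kw: float, cooling_overhead: float) -> float:
    """Total draw = IT load plus cooling proportional to the heat produced."""
    return it_power_kw * (1.0 + cooling_overhead)

it_load_kw = 1000.0  # assumed IT load of a small facility
overhead = 0.35      # assumed: 0.35 kW of cooling per kW of heat

baseline = facility_power_kw(it_load_kw, overhead)

# If interconnects generated less heat, say a 10% lower IT draw
# (assumption), the cooling term shrinks with it.
reduced = facility_power_kw(it_load_kw * 0.90, overhead)

print(f"Baseline: {baseline:.0f} kW, reduced: {reduced:.0f} kW")
print(f"Total saving: {baseline - reduced:.0f} kW "
      f"for {it_load_kw * 0.10:.0f} kW saved at the racks")
```

The point of the sketch is the compounding: every kilowatt of heat avoided at the interconnects also removes the cooling energy that kilowatt would have required.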
“In the Netherlands, data centers already consume roughly the same amount of electricity as two million households — close to five percent of total national usage. We cannot continue building more of the same; the physical headroom is gone.”
Centraal Bureau voor de Statistiek
National energy figures, 2025–2026
The Energy Wall made visible
The Energy Wall is not a future scenario; it is already shaping decisions. Projects stall not because technology is unavailable, but because energy cannot be delivered at the required scale or predictability. Efficiency improvements at component level still matter, but they no longer move the needle at system level.
The analogy is instructive: widening individual roads does little if the main junction is saturated. Beyond a certain point, only architectural change — not incremental optimisation — can restore flow.
This is where the discussion inevitably shifts from engineering to governance. When energy becomes the bottleneck, decisions about data center design, location and technology choice affect national infrastructure planning. Growth becomes a collective question rather than a private optimisation problem.
Where photonics changes the equation — without miracle claims
Photonics does not “solve” the energy problem. What it does is alter the underlying system dynamics.
By replacing electrical interconnects with optical ones, photonics dramatically reduces resistive losses at high bandwidths. Less heat is generated where data moves most intensely. This, in turn, changes cooling requirements, reduces hot spots and enables more stable thermal profiles across the facility.
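The scale of the difference can be illustrated with energy-per-bit arithmetic. The picojoule figures below are placeholder assumptions chosen for the calculation, not measured values for any specific product:

```python
# Illustrative: power dissipated by interconnects at a given aggregate
# bandwidth, for assumed electrical vs optical energy-per-bit figures.

def link_power_watts(bandwidth_tbps: float, energy_pj_per_bit: float) -> float:
    """Power = bits/s x joules/bit. Tb/s -> bits/s, pJ -> J."""
    bits_per_second = bandwidth_tbps * 1e12
    return bits_per_second * energy_pj_per_bit * 1e-12

bandwidth_tbps = 50.0  # assumed aggregate interconnect bandwidth

electrical = link_power_watts(bandwidth_tbps, 5.0)  # assumed: 5 pJ/bit
optical = link_power_watts(bandwidth_tbps, 1.0)     # assumed: 1 pJ/bit

print(f"Electrical links: {electrical:.0f} W, optical links: {optical:.0f} W")
```

Because nearly all of that power is dissipated as heat exactly where data moves most densely, the gap between the two figures is also a gap in hot-spot intensity, not just in electricity bills.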
“If we continue with current technologies, by 2030 we would theoretically need today’s entire global energy capacity just to keep the internet running. That is physically impossible. Integrated photonics is the only architectural path that allows growth without system collapse.”
Joost Verberk
Director of Product Line Management, EFFECT Photonics
Crucially, photonics also improves predictability. Optical systems exhibit less load-dependent thermal volatility, resulting in flatter energy demand curves. For grid operators, this matters as much as absolute consumption. Predictable neighbours are easier to integrate than spiky ones.
The grid perspective: from large consumer to system participant
From the viewpoint of grid operators, unpredictability is the real enemy. Sudden peaks, rapid scaling and opaque load profiles complicate planning and investment. This is why the relationship between data centers and the grid is evolving.
“Growth is possible, but only if the sector remains in balance with the grid. Data centers must transform from passive large consumers into active, intelligent participants in the energy system. Grid integration is no longer optional.”
Stijn Grove
Director, Dutch Data Center Association
Photonics contributes here indirectly but materially. By reducing thermal volatility and enabling new architectural layouts, it makes energy demand more manageable at system level. The data center becomes a more predictable node — not an uncontrollable sink.
From optimisation to infrastructure design
The central lesson is straightforward, but uncomfortable. The future of digital infrastructure will not be decided by marginal efficiency gains alone. It will be shaped by how well architecture aligns with physical energy systems.
The relevant question, therefore, is no longer how to build faster data centers, but how to design digital infrastructure that fits within societal energy boundaries. When energy becomes the constraint, infrastructure choices turn into governance decisions.
Photonics does not offer an escape from these limits. It offers a way to design within them.
