AI data centers today consume as much electricity as entire countries. Researchers at the University of Cambridge have now developed a new chip material that they say could reduce AI hardware energy consumption by up to 70 percent. Their memristor, made from modified hafnium oxide, switches with currents roughly one million times lower than those of conventional components of this type. The findings were published in the journal Science Advances.
The fundamental problem with today's AI hardware
Conventional computer chips separate memory and processor. Every calculation requires data transfers between the two, which costs energy and takes time. The human brain works differently: neurons process and store information in the same place, with an energy consumption orders of magnitude lower than that of today's chips.
Memristors aim to replicate this architecture. They can change their electrical resistance, which lets a single component handle both memory and computational operations. The problem with existing memristors is that they require comparatively high switching currents and exhibit unstable resistance states. The Cambridge team describes how it solved this with a new material.
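How such a component can compute where it stores is easiest to see in a toy model. The sketch below is purely illustrative; the array size, conductance range, and voltages are arbitrary assumptions, not parameters of the Cambridge device. Weights sit in the array as conductances, inputs arrive as voltages, and the output currents already form a matrix-vector product, so nothing has to travel to a separate processor.

```python
import numpy as np

# Toy model of analog in-memory computing in a memristor crossbar.
# Stored weights are conductances G (siemens); inputs are applied as
# voltages V (volts). Ohm's law supplies the multiplications, Kirchhoff's
# current law the additions -- the computation happens where the data lives.

rng = np.random.default_rng(seed=0)

G = rng.uniform(1e-6, 1e-4, size=(4, 8))   # conductances of a 4x8 crossbar (assumed range)
V = rng.uniform(0.0, 0.2, size=8)          # read voltages on the 8 input lines

# Current collected on each of the 4 output lines: I[i] = sum_j G[i, j] * V[j]
I = G @ V

print(I)  # four output currents, read directly from the array
```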
A new material with p-n junctions
The breakthrough lies in the composition: the researchers added small amounts of strontium and titanium to hafnium oxide, a material long established in semiconductor manufacturing, and used a two-step growth process. This created p-n junctions at the layer boundaries. Instead of relying on the forming and breaking of electrical filaments, as conventional memristors do, the new component changes its resistance by adjusting the energy barrier at these junctions.
The result is switching currents roughly one million times lower than those of comparable oxide memristors. At the same time, hundreds of stable conductance levels can be set, which is essential for analog in-memory computing. The component remained stable over tens of thousands of switching cycles and demonstrated biologically inspired learning behavior, including spike-timing-dependent plasticity: connections strengthen or weaken depending on the relative timing of signals, as in the human brain.
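Both properties can be made concrete in a short, purely illustrative sketch. The number of levels, the conductance range, and the pair-based learning rule below are generic textbook assumptions, not values or models from the paper: weights are snapped to a fixed set of conductance levels, and an STDP-style update raises a weight when the input spike precedes the output spike and lowers it when it follows.

```python
import numpy as np

# Illustrative sketch only -- not the published device model.
# 1) Weights are quantized to a finite set of stable conductance levels,
#    as analog in-memory computing requires.
# 2) A pair-based STDP rule adjusts a weight from the relative timing of a
#    presynaptic (input) and a postsynaptic (output) spike.

N_LEVELS = 256                        # "hundreds of stable conductance levels"
G_MIN, G_MAX = 1e-6, 1e-4             # assumed conductance range in siemens
LEVELS = np.linspace(G_MIN, G_MAX, N_LEVELS)

def quantize(g):
    """Snap a conductance value to the nearest discrete level."""
    return LEVELS[np.argmin(np.abs(LEVELS - g))]

def stdp_update(g, t_pre, t_post, a_plus=2e-6, a_minus=2e-6, tau=20e-3):
    """Strengthen if the pre-spike precedes the post-spike, weaken otherwise;
    the size of the change decays with the timing difference."""
    dt = t_post - t_pre
    if dt > 0:
        g += a_plus * np.exp(-dt / tau)    # causal pairing: potentiation
    else:
        g -= a_minus * np.exp(dt / tau)    # anti-causal pairing: depression
    return quantize(np.clip(g, G_MIN, G_MAX))

g = quantize(5e-5)
g = stdp_update(g, t_pre=0.010, t_post=0.015)   # input fires first -> g rises
g = stdp_update(g, t_pre=0.030, t_post=0.025)   # output fires first -> g falls
print(g)
```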
70 percent less power and a real hurdle
In concrete terms, an AI chip based on this architecture could consume up to 70 percent less electricity than current hardware. Given that AI data centers are projected to consume around 800 terawatt-hours annually by 2026, roughly 3 percent of global electricity according to the International Energy Agency, a saving on that scale would significantly ease the pressure on power grids and shrink the associated carbon footprint.
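As a rough back-of-envelope check (deliberately optimistic in assuming the chip-level saving would carry over to the entire projected load):

```python
# Back-of-envelope arithmetic on the figures quoted above. Assuming the
# 70 percent chip-level saving applies to the full projected demand is
# optimistic, since cooling and other overheads are unaffected.
projected_twh_2026 = 800        # projected AI data-center demand, TWh per year
saving_fraction = 0.70          # claimed reduction in chip energy use

saved_twh = projected_twh_2026 * saving_fraction        # 560 TWh per year
remaining_twh = projected_twh_2026 - saved_twh          # 240 TWh per year
print(f"saved: {saved_twh:.0f} TWh/year, remaining: {remaining_twh:.0f} TWh/year")
```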
There is, however, a central technical obstacle: the new material's manufacturing process requires temperatures of around 700 degrees Celsius. That is higher than standard semiconductor production typically tolerates. Current production lines at TSMC and Samsung are designed for lower process temperatures, because greater heat can damage layers already deposited. Without a modified manufacturing process, the material cannot go into mass production in the near term. The researchers address this openly in the publication, noting that integration into existing production lines will require further development work.
A possible path, but not a quick one
Neuromorphic chips are not a new concept. IBM's "TrueNorth" and Intel's "Loihi" have been in research for years, and the European Human Brain Project has invested billions in similar approaches. To date, neuromorphic chips have barely moved beyond research laboratories, because their performance on real AI tasks still falls short of that of GPU architectures.
The Cambridge material is nonetheless a concrete step forward: it demonstrates that the critical physical properties, low switching currents and stable conductance levels, can be combined in a manufacturing-relevant material. The team says it is working on processes that would allow lower manufacturing temperatures. Before such a chip runs in an AI data center, however, years of further development likely remain.