Researchers have developed a new kind of nanoelectronic device that mimics the human brain and could dramatically cut the energy consumed by artificial intelligence hardware.
The researchers, led by the University of Cambridge, developed a form of hafnium oxide that acts as a highly stable, low‑energy ‘memristor’ — a component designed to mimic the efficient way neurons are connected in the brain.
Abstract
The escalating energy consumption of existing artificial intelligence hardware has become a serious global issue that demands immediate action. Neuromorphic computing promises to drastically reduce this footprint. Here, we introduce multicomponent p-type Hf(Sr,Ti)O₂ thin films for energy-efficient, resistive switching–based neuromorphic devices. We demonstrate interfacial memristors with ultralow switching currents (≤~10⁻⁸ A), exceptional cycle-to-cycle and device-to-device uniformities, and retention >10⁵ s. They reveal hundreds of ultralow conductance levels with a modulation range of >50 (without reaching any saturation) and reproducibly satisfy unsupervised learning rules. This performance originates from incorporating a self-assembled p-n heterointerface between p-type Hf(Sr,Ti)O₂ and n-type TiOxNy, resulting in a fully depleted space-charge layer asymmetrically extended into Hf(Sr,Ti)O₂, a large built-in potential, and an extremely low saturation current density under reverse bias. Ultralow conductance modulation is controlled by tuning the p-n heterointerface's energy-barrier height through electro-ionic charge migration. This materials-engineering strategy addresses energy consumption and variability in existing memristors, opening a pathway toward energy-efficient neuromorphic computing systems.
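(For anyone rusty on diode physics: the abstract's "tune the barrier, tune the current" logic follows from the textbook diode relations below. This is a generic sketch, not the authors' transport model; φ_B is just a stand-in for the heterointerface barrier height.)

```latex
% Generic diode relations (textbook, not from the paper): current density J
% under bias V, with saturation current J_s set exponentially by the
% energy-barrier height \phi_B.
\[
  J = J_s \left( e^{\,qV/k_B T} - 1 \right), \qquad
  J_s \propto e^{-q\phi_B / k_B T}
\]
% A small electro-ionic shift in \phi_B therefore moves the reverse-bias
% current (and hence the device conductance) exponentially, which is how
% hundreds of distinct conductance levels can sit below ~10^{-8} A.
```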
Yeah. I can believe that forces within the human brain could help AI reduce its power consumption.
Step 1) Turn off AI.
Step 2) There is no step 2.
At least, that’s what my brain thought.
I was half-expecting that new material to be hubris.
Hubrinium.

could dramatically cut the energy consumed by artificial intelligence hardware
Decreasing the cost of using a resource almost always results in more use of that resource (the Jevons paradox).
Laboratory tests showed the devices could reliably endure tens of thousands of switching cycles
That’s not very many when GPUs perform trillions of operations per second.
It’d probably be far more appropriate for an analogue system where it isn’t constantly being switched, but is instead what the model is burned onto.
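The pitch for that, for anyone who hasn’t seen it: in a memristor crossbar the weights live in the conductances, and a vector-matrix multiply falls out of Ohm’s and Kirchhoff’s laws in a single analogue step. A minimal numerical sketch (idealized devices, made-up numbers, nothing from this paper):

```python
# Idealized memristor crossbar: weights stored as conductances G (siemens),
# inputs applied as row voltages V, outputs read as column currents I.
# Ohm's law per cell plus Kirchhoff's current law per column gives I = V @ G,
# i.e. one analogue vector-matrix multiply with no per-weight switching.
import numpy as np

rng = np.random.default_rng(0)
G = rng.uniform(1e-9, 5e-8, size=(4, 3))   # conductances, ~nA-scale currents
V = np.array([0.1, 0.2, 0.0, 0.05])        # read voltages per row (volts)

I = V @ G                                   # column currents (amps)
print(I)                                    # the "burned-in" model's output
```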
This seems like such a glaringly obvious solution to lower inference cost that surely there must be some fundamental flaw in it… otherwise all of the big AI firms would be doing it, right?
Right…?
It takes a while for the technology to become available in ASICs; we still don’t have purpose-designed silicon for AI. We’re still using repurposed GPUs with scaled-up tensor cores for pretty much all AI workloads.
I feel like memristors have been a buzzword for like 20 years now… am I wrong? Is it for real this time?
Cool tech but very much in its infancy. Good for them on getting a hype article, but this in no way affects anything about computing today.
AI boosters are no longer allowed to explain what’s good about AI using the future tense. You can no longer say “it will,” “could,” “might,” “likely,” “possible,” “estimated,” “promise,” or any other term that reviews today’s capabilities in the language of the future.
Shooting a large rocket full of tech bros directly into the sun will have a similar effect.
Shooting a rocket directly into the sun would waste as much energy as current AI data centers, because it would have to shed all of Earth’s orbital momentum.
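Back-of-envelope, for the curious (the orbital speed is real; the single-stage hydrolox assumption is mine):

```python
# Rough cost of cancelling Earth's orbital velocity to fall into the sun.
# Assumes a direct trajectory (no gravity assists) and ignores Earth escape;
# 29.8 km/s is Earth's mean heliocentric orbital speed.
import math

V_ORBIT = 29_800.0     # m/s
ISP = 450.0            # s, typical hydrolox upper-stage specific impulse
G0 = 9.81              # m/s^2

# Kinetic energy to shed, per kilogram of payload (~123 kWh/kg)
ke_per_kg = 0.5 * V_ORBIT**2
print(f"KE to shed: {ke_per_kg / 3.6e6:.0f} kWh per kg of payload")

# Tsiolkovsky rocket equation: propellant mass ratio for ~30 km/s of delta-v
mass_ratio = math.exp(V_ORBIT / (ISP * G0))
print(f"Required mass ratio: ~{mass_ratio:.0f}:1")
```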
Better to just use a volcano.
We can sell tickets for people to take turns pushing. A dollar per participant sounds reasonable to me.
Oh great, now we’re gonna start polluting the sun? Can’t we contain our garbage to a single planet?
All of the things you’d be polluting the sun with are already there.