Research.

Researchers have developed a new kind of nanoelectronic device that could dramatically cut the energy consumed by artificial intelligence hardware by mimicking the human brain.

The researchers, led by the University of Cambridge, developed a form of hafnium oxide that acts as a highly stable, low‑energy ‘memristor’ — a component designed to mimic the efficient way neurons are connected in the brain.

  • eleitl@lemmy.zip · 5 hours ago

    Abstract

    The escalating energy consumption of existing artificial intelligence hardware has become a serious global issue that demands immediate action. Neuromorphic computing promises to drastically reduce this footprint. Here, we introduce multicomponent p-type Hf(Sr,Ti)O2 thin films for energy-efficient, resistive switching–based neuromorphic devices. We demonstrate interfacial memristors with ultralow switching currents (≤ ~10⁻⁸ A), exceptional cycle-to-cycle and device-to-device uniformities, and retention >10⁵ s. They reveal hundreds of ultralow conductance levels with a modulation range of >50 (without reaching any saturation) and reproducibly satisfy unsupervised learning rules. This performance originates from incorporating a self-assembled p-n heterointerface between p-type Hf(Sr,Ti)O2 and n-type TiOxNy, resulting in a fully depleted space-charge layer asymmetrically extended into Hf(Sr,Ti)O2, a large built-in potential, and an extremely low saturation current density under reverse bias. Ultralow conductance modulation is controlled by tuning the p-n heterointerface’s energy-barrier height through electro-ionic charge migration. This materials-engineering strategy addresses energy consumption and variability in existing memristors, opening a pathway toward energy-efficient neuromorphic computing systems.
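The abstract’s “hundreds of ultralow conductance levels” and “unsupervised learning rules” map onto the standard memristive-synapse picture: a weight stored as a device conductance, updated in place by a local Hebbian-style rule. A minimal illustrative sketch of that scheme — every parameter value and function name here is a hypothetical choice, not taken from the paper:

```python
# Illustrative memristive synapse: discrete programmable conductance levels
# plus a local Hebbian update. Values are hypothetical; the paper reports
# hundreds of levels and a >50x modulation range without saturation.
import numpy as np

G_MIN = 1e-9            # minimum conductance in siemens (assumed)
G_MAX = 50 * G_MIN      # >50x modulation range, per the abstract
N_LEVELS = 200          # "hundreds of ultralow conductance levels"
LEVELS = np.linspace(G_MIN, G_MAX, N_LEVELS)

def quantize(g):
    """Snap a target conductance to the nearest programmable level."""
    return LEVELS[np.abs(LEVELS - g).argmin()]

def hebbian_update(g, pre, post, lr=1e-9):
    """Local unsupervised rule: potentiate when pre and post fire together."""
    g_new = np.clip(g + lr * pre * post, G_MIN, G_MAX)
    return quantize(g_new)

# Repeated coincident activity drives the weight up until it saturates
# at the top of the conductance window.
g = quantize(10e-9)
for _ in range(100):
    g = hebbian_update(g, pre=1.0, post=1.0)
```

The learning rate must exceed half the level spacing, or the quantizer snaps every update back to the same level — one reason analog update granularity matters in real devices.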

  • Lost_My_Mind · 19 hours ago

    Yeah. I can believe that forces within the human brain could help AI reduce its power consumption.

    Step 1) Turn off AI.

    Step 2) There is no step 2.

    At least, that’s what my brain thought.

  • veee@lemmy.ca · 21 hours ago

    I was half-expecting that new material to be hubris.

  • Zak · 21 hours ago

    could dramatically cut the energy consumed by artificial intelligence hardware

    Decreasing the cost of using a resource almost always results in more use of that resource.

    Laboratory tests showed the devices could reliably endure tens of thousands of switching cycles

    That’s not very many when GPUs perform trillions of operations per second.
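The endurance gap is easy to put in numbers: with tens of thousands of write cycles, device lifetime depends entirely on how often it is rewritten. A back-of-envelope sketch — the write rates below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope wear-out math. ENDURANCE_CYCLES reflects the article's
# "tens of thousands of switching cycles"; write rates are assumptions.
ENDURANCE_CYCLES = 10_000

def lifetime_seconds(endurance_cycles, writes_per_second):
    """Seconds until wear-out if every write consumes one endurance cycle."""
    return endurance_cycles / writes_per_second

# Rewritten at GPU-like rates, the device wears out almost instantly...
burst = lifetime_seconds(ENDURANCE_CYCLES, 1e9)        # 10 microseconds
# ...but reprogrammed once a day, it lasts roughly 10,000 days.
rare = lifetime_seconds(ENDURANCE_CYCLES, 1 / 86_400)
```

This is why limited endurance rules out using such devices as fast working memory but not as rarely rewritten weight storage.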

    • ryannathans@aussie.zone · 20 hours ago

      It’d probably be far more appropriate for an analogue system where it isn’t constantly being switched, but is instead what the model is burned onto.

      • very_well_lost · 19 hours ago

        This seems like such a glaringly-obvious solution to lower inference cost that surely there must be some fundamental flaw in it… otherwise all of the big AI firms would be doing it, right?

        Right…?

        • ryannathans@aussie.zone · 17 hours ago

          It takes a while for the technology to become available in ASICs; we still don’t have purpose-designed silicon for AI. We’re still using repurposed GPUs with scaled-up tensor cores for pretty much all AI workloads.
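The “burned onto” idea above corresponds to the standard memristor-crossbar picture: weights sit as fixed conductances, and a matrix-vector product falls out of Ohm’s and Kirchhoff’s laws in a single analog read, with no switching at all. A schematic sketch with made-up values:

```python
# Schematic crossbar inference: weights are a fixed conductance matrix G
# (siemens), inputs are row voltages V, and each column current is
# I_j = sum_i V_i * G[i, j] (Ohm's law summed by Kirchhoff's current law).
# Reading is one analog step; no device switching occurs. Values are
# illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(seed=0)
G = rng.uniform(1e-9, 50e-9, size=(4, 3))  # 4 inputs x 3 outputs
V = np.array([0.10, 0.25, 0.00, 0.05])     # read voltages per row

I = V @ G  # column currents = the matrix-vector product, "for free"
```

Endurance then only matters when the model is reprogrammed, not on every inference, which is why write-rarely analog accelerators sidestep the cycle-count objection.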

  • felixwhynot · 19 hours ago

    I feel like memristors have been a buzzword for like 20 years now… am I wrong? Is it for real this time?

  • brendansimms · 18 hours ago

    Cool tech, but very much in its infancy. Good for them on getting a hype article, but this in no way affects anything about computing today.

  • Prox · 16 hours ago

    Relevant

    AI boosters are no longer allowed to explain what’s good about AI using the future tense. You can no longer say “it will,” “could,” “might,” “likely,” “possible,” “estimated,” “promise,” or any other term that reviews today’s capabilities in the language of the future.

  • db2 · 21 hours ago

    Shooting a large rocket full of tech bros directly into the sun will have a similar effect.

    • Em Adespoton@lemmy.ca · 18 hours ago

      Shooting a rocket directly into the sun would waste as much energy as current AI data centers, because it would have to shed all the earth’s momentum.

      Better to just use a volcano.

      • db2 · 16 hours ago

        We can sell tickets for people to take turns pushing. A dollar per participant sounds reasonable to me.