cross-posted from: https://lemmy.intai.tech/post/72919

Parameters count:

GPT-4 is more than 10x the size of GPT-3. We believe it has a total of ~1.8 trillion parameters across 120 layers. Mixture Of Experts - Confirmed.

OpenAI was able to keep costs reasonable by utilizing a mixture-of-experts (MoE) model. They utilize 16 experts within their model, each about ~111B parameters for the MLP (16 × ~111B ≈ 1.78T in expert MLPs alone, consistent with the ~1.8T total). 2 of these experts are routed to per forward pass.

Related Article: https://lemmy.intai.tech/post/72922
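To make the routing idea concrete, here is a minimal top-2 MoE gating sketch in NumPy. This is purely illustrative of the general technique, not OpenAI's implementation; the hidden size, gate, and "experts" (plain linear maps standing in for MLPs) are all made-up toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 16   # experts per MoE layer, per the post
TOP_K = 2          # experts routed to per forward pass, per the post
D_MODEL = 64       # hypothetical hidden size, for illustration only

def top2_route(x, gate_w):
    """Pick the top-2 experts for one token and softmax their gate scores."""
    logits = x @ gate_w                       # one score per expert
    top = np.argsort(logits)[-TOP_K:][::-1]   # indices of the 2 highest scores
    scores = np.exp(logits[top] - logits[top].max())
    weights = scores / scores.sum()           # normalize over the chosen experts
    return top, weights

# toy "experts": linear maps standing in for the ~111B-parameter MLPs
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
gate_w = rng.standard_normal((D_MODEL, NUM_EXPERTS))

x = rng.standard_normal(D_MODEL)              # one token's hidden state
idx, w = top2_route(x, gate_w)
# only the 2 selected experts run; the output is their weighted sum
y = sum(wi * (x @ experts[i]) for wi, i in zip(w, idx))
```

The cost saving comes from the last line: although all 16 experts' parameters exist, only 2 experts' MLPs execute per token.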

  • Maple · 14 points · 1 year ago

    “Half of those additions are censors and more creative ways to say ‘sorry, I can’t do that for you Jim.’” Lol, I’m just kidding, 1.8t parameters is incredible.

    I just really hope that it’s not as censored as it currently is. ;_;