Inside the GPT-3

  • Behohippy
    4 · 1 year ago

    I’ve got a background in deep learning and I still struggle to understand the attention mechanism. I know it’s a key/value store, but I’m not sure what it’s doing to the tensor as it passes through the different layers.

    • Varun
      1 · 1 year ago

      @behohippy @saint Instead of modeling the sequence timestep by timestep, attention lets us feed the whole sequence through the network in parallel, much like a fully connected layer. The positional encoding tells the model where each token sits in the sequence, and keys with low attention weights contribute little to the output, so they can effectively be dropped…
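
      To make the key/value analogy concrete, here is a minimal NumPy sketch of single-head scaled dot-product attention (all shapes and variable names are illustrative, not from the post): every query vector is compared against every key, the softmaxed scores weight the value vectors, and the weighted sum is the tensor that gets handed on to the next layer.

      ```python
      # Minimal sketch of single-head scaled dot-product attention.
      # Shapes and names (x, W_q, W_k, W_v) are illustrative only.
      import numpy as np

      def attention(x, W_q, W_k, W_v):
          """x: (seq_len, d_model) token embeddings, positional encoding already added."""
          Q = x @ W_q                      # queries: what each token is looking for
          K = x @ W_k                      # keys: what each token offers for matching
          V = x @ W_v                      # values: the content actually passed along
          d_k = K.shape[-1]
          scores = Q @ K.T / np.sqrt(d_k)  # (seq_len, seq_len) query-key similarities
          weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
          weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
          return weights @ V               # each row is a weighted mix of value vectors

      # Example: 5 tokens, d_model = 8, projected down to d_k = d_v = 4
      rng = np.random.default_rng(0)
      x = rng.normal(size=(5, 8))
      W_q, W_k, W_v = (rng.normal(size=(8, 4)) for _ in range(3))
      out = attention(x, W_q, W_k, W_v)    # (5, 4): the tensor that flows to the next layer
      ```

      Because the whole seq_len × seq_len score matrix is computed at once, every token attends to every other token in a single pass, which is what removes the timestep-by-timestep loop of an RNN.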

  • @kromem
    3 · 1 year ago

    What are you eating that needs a napkin that large?