• @[email protected]
    1 year ago

    I think we need to shift our paradigm one or two more times before we can start seriously talking about AGI. Current transformer models are impressive, but they’re much better suited to modeling language than what I would call “cognition”.
    I think we’re close, but I don’t think we’ll get there just by scaling up or refining the current technology.

    • @thantik
      1 year ago

      Hell, honestly, LLMs are already smarter than half of the people I know.