• Fubarberry
    34
    9 months ago

    I hadn’t really even considered that Apple wouldn’t be working on their own LLM. Seems like everyone is making their own LLM these days.

      • TheRealKuni
        3
        9 months ago

        Remember the early days of Apple Maps?

        If that’s any indication, Apple’s AI offerings will someday be as good as or better than Google’s. ’Cause Apple Maps is pretty great these days, but it was absolute garbage when they rolled it out.

    • @abhibeckert
      4
      9 months ago

      Apple is working on models, but they seem to be focusing on ones that use tens of gigabytes of RAM, rather than tens of terabytes.

      I wouldn’t be surprised if Apple ships an “iPhone Pro” with 32GB of RAM dedicated to AI models. You can do a lot of really useful stuff with a model like that… but it can’t compete with GPT-4 or Gemini today - and those are moving targets. OpenAI/Google will have even better models (likely using even more RAM) by the time Apple enters this space.

      A split system, where some processing happens on-device and some in the cloud, could work really well. For example, analyse every email, message, and call a user has ever sent or received with the local model, but if the user asks how many teeth a crocodile has… you send that one to the cloud.
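      The split described above could be sketched as a simple router: queries touching personal data stay on-device, general-knowledge queries go to the cloud. This is purely illustrative - the keyword list, function names, and routing rule are all hypothetical, not any real Apple or OpenAI API.

      ```python
      # Hypothetical sketch of an on-device/cloud split for LLM queries.
      # Queries mentioning personal data are handled locally; everything
      # else is forwarded to a larger cloud model. All names are made up.

      PERSONAL_KEYWORDS = {"email", "message", "call", "contact", "photo"}

      def route(query: str) -> str:
          """Return which model should handle the query."""
          words = set(query.lower().split())
          if words & PERSONAL_KEYWORDS:
              return "on-device"  # private data never leaves the phone
          return "cloud"          # general knowledge goes to the big model

      print(route("summarize my email thread"))             # on-device
      print(route("how many teeth does a crocodile have"))  # cloud
      ```

      A real system would need something smarter than keyword matching (e.g. a small on-device classifier), but the control flow would look roughly like this.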

      • Fubarberry
        2
        9 months ago

        Tbf, Google has versions of Gemini that will run locally on phones too, and their open source Gemini models run on 16GB of RAM or so.