• @cyd
    18 points · 1 year ago

    Strange that they don’t just use an open-weights model; there are several now that surpass ChatGPT 3.5, which is probably good enough for what they need.

    • FaceDeer
      15 points · 1 year ago

      Might be that they started training before those open models were available. Or they were just lazy and OpenAI’s API was easier.

    • @muntedcrocodile
      5 points · 1 year ago

      What models have you found to surpass 3.5? Any that outpace GPT-4 yet?

      • @cyd
        6 points · 1 year ago

        Mistral 7B and deepseek-ai are two open-weight models that surpass 3.5, though not 4, on several measures.

      • @4onen
        3 points · 11 months ago

        Mixtral 8x7B, just out. It codes better than ChatGPT in the few prompts I’ve tried so far, and I can run it at 2 to 3 tokens per second on my GPU-less laptop.

  • AutoTL;DR (bot)
    3 points · 1 year ago

    This is the best summary I could come up with:


    TikTok’s entrancing “For You” feed made its parent company, ByteDance, an AI leader on the world stage.

    But that same company is now so behind in the generative AI race that it has been secretly using OpenAI’s technology to develop its own competing large language model, or LLM.

    This practice is generally considered a faux pas in the AI world.

    It’s also in direct violation of OpenAI’s terms of service, which state that its model output can’t be used “to develop any artificial intelligence models that compete with our products and services.” Microsoft, through which ByteDance buys its OpenAI access, has the same policy.

    Nevertheless, internal ByteDance documents shared with me confirm that the OpenAI API has been relied on to develop its foundational LLM, codenamed Project Seed, during nearly every phase of development, including for training and evaluating the model.

    Employees involved are well aware of the implications; I’ve seen conversations on Lark, ByteDance’s internal communication platform for employees, about how to “whitewash” the evidence through “data desensitization.” The misuse is so rampant that Project Seed employees regularly hit their max allowance for API access.


    The original article contains 187 words, the summary contains 187 words. Saved 0%. I’m a bot and I’m open source!