Office space meme:
“If y’all could stop calling an LLM ‘open source’ just because they published the weights… that would be great.”
Training code created by the community always pops up shortly after release; it has happened for every major model so far. Additionally, you have never needed the original training dataset to continue training a model.
So, Ocarina of Time is considered open source now, since it’s been decompiled by the community, or what?
Community effort and the ability to build on top of something don’t make it open source.
Also: initial training data is important.