- cross-posted to:
- [email protected]
cross-posted from: https://lemmy.zip/post/27030131
The full repo: https://github.com/vongaisberg/gpt3_macro
Looking at the source, they thankfully already use a temperature of zero, but max_tokens is only 320. That doesn't seem like much for code, especially since most symbols are a whole token. A rough sketch of what those request parameters look like is below.
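For context, here's roughly what an OpenAI completions request with those parameters looks like in Rust. This is just a sketch, not the crate's actual code; the model name, prompt, and the bumped max_tokens value are assumptions for illustration.

```rust
// Illustrative only: shows where temperature and max_tokens sit in a
// completions request. Uses reqwest (blocking + json features) and serde_json.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("OPENAI_API_KEY")?;
    let client = reqwest::blocking::Client::new();

    let body = json!({
        "model": "text-davinci-003",        // assumed model, for illustration
        "prompt": "// a Rust function that ...", // prompt built from the macro input
        "temperature": 0,                   // deterministic output, as the repo already does
        "max_tokens": 1024                  // hypothetically raised from 320 for longer functions
    });

    let resp: serde_json::Value = client
        .post("https://api.openai.com/v1/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()?
        .json()?;

    // The generated code comes back as plain text in the first choice.
    println!("{}", resp["choices"][0]["text"]);
    Ok(())
}
```

With temperature pinned at 0 the output is as deterministic as the API allows, so the only real knob left for longer generated functions is max_tokens.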