LLMs spew hallucinations by their nature. But what if you could get the LLM to correct itself? Or if you just claimed your LLM could correct itself? Last week, Matt Shumer of AI startup HyperWrite/…
another valiant attempt to get “promptfonder” into more common currency
Not a computer scientist? How the fuck do these guys get money in the first place to have startups? It's eternally frustrating that dudes like this will keep failing upwards for their entire lives.