This is pointless. You can already self-host open-source LLMs on your own machine using LM Studio.
I've dabbled with open-source models, but have yet to see one that exceeds the quality of ChatGPT 3.5. But I get that things are changing fast. Am I wrong?
reply
100 sats \ 1 reply \ @doofus 28 Feb
I agree, and unless you have a beast of a computer, local LLMs are so slow.
reply
I'm using Mistral 7B 8-bit and it's really fast, and my PC is average. The quality might not exceed ChatGPT 3.5, but it's pretty decent if you ask me. For most people it's enough, and it's free and offline.
reply
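A rough reason a 7B model at 8-bit runs fine on an average PC: at 8 bits (1 byte) per parameter, the weights alone need about 7 GB of memory, versus roughly double that at fp16. A back-of-envelope sketch (weights only; runtime overhead like the KV cache is ignored):

```python
# Back-of-envelope memory estimate for a quantized LLM's weights.
def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
    """Return approximate weight memory in decimal GB."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

print(weight_memory_gb(7, 8))   # 7B at 8-bit  -> 7.0 GB
print(weight_memory_gb(7, 16))  # 7B at fp16   -> 14.0 GB
print(weight_memory_gb(7, 4))   # 7B at 4-bit  -> 3.5 GB
```

That's why 4-bit and 8-bit quantizations are what people actually run on consumer hardware.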
You can't self-host GPT-4. You can't do pay-for-what-you-use without KYC.
reply
Mistral 7B 8-bit is pretty decent for most prompts. Try it out and let me know what you think. Meta is also open-sourcing their Llama models.
reply
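If you want to script against a locally hosted model like the one above, LM Studio serves an OpenAI-compatible chat-completions endpoint. A minimal sketch, assuming the default local server at http://localhost:1234/v1 and a loaded model; the model name shown is a placeholder for whatever you have loaded:

```python
# Minimal sketch: query a locally served LLM through an
# OpenAI-compatible endpoint (LM Studio defaults to port 1234).
# The URL and model name below are assumptions -- adjust to your setup.
import json
import urllib.request

def build_request(prompt: str, model: str = "mistral-7b-instruct") -> dict:
    # Standard OpenAI-style chat-completions payload.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local(prompt: str,
              url: str = "http://localhost:1234/v1/chat/completions") -> str:
    # POST the JSON payload and pull the assistant's reply out of the response.
    req = urllib.request.Request(
        url,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Everything stays on your machine, so there's no API key and no KYC involved.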
shun the non-believer!
šŸ¦„ shuunnnnnnn
reply