0 sats \ 6 replies \ @JuanMiguel 28 Feb freebie \ on: Introducing PayPerQ (ppq.ai), your new default GPT4 experience, powered by LN bitcoin
This is pointless. You can already self-host open-source LLMs on your own machine using LM Studio.
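If you want to try it, here's a minimal sketch of talking to LM Studio's local OpenAI-compatible server from Python. It assumes the local server is running on the default http://localhost:1234/v1 with a model already loaded in the app; the model name below is just a placeholder.

```python
# Minimal sketch: query a model served locally by LM Studio.
# Assumes LM Studio's local server is running on the default
# http://localhost:1234/v1 endpoint with a model already loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's OpenAI-compatible endpoint
    api_key="lm-studio",                  # any non-empty string; no real key is needed locally
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio answers with whichever model you loaded
    messages=[{"role": "user", "content": "Summarize the Lightning Network in one sentence."}],
    temperature=0.7,
)
print(response.choices[0].message.content)
```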
I've dabbled with open-source models but have yet to see one that matches the quality of GPT-3.5. But I get that things are changing fast. Am I wrong?
reply
I'm using Mistral 7B at 8-bit and it's really fast, even though my PC is average. The quality might not exceed GPT-3.5, but it's pretty decent if you ask me. For most people it's enough, and it's free and offline.
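For reference, a rough sketch of loading Mistral 7B in 8-bit yourself with Hugging Face transformers and bitsandbytes (not necessarily how LM Studio runs it under the hood; the checkpoint name and prompt are just examples, and it assumes a CUDA GPU with roughly 8 GB of free VRAM):

```python
# Sketch: load Mistral 7B with 8-bit quantization via transformers + bitsandbytes.
# Assumes a CUDA GPU with around 8 GB of free VRAM and `pip install transformers bitsandbytes accelerate`.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # example checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # 8-bit weights
    device_map="auto",  # place layers on the available GPU(s)
)

prompt = "[INST] Explain what a Lightning invoice is in two sentences. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```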
reply
You can't self-host GPT-4. And with OpenAI directly, you can't pay for only what you use without KYC.
reply
Mistral 7B 8-bit is pretty decent for most prompts. Try it out and let me know what you think. Meta is also open-sourcing its Llama models.
shun the non-believer!
shuunnnnnnn
reply