Pretty crazy.
I guess us non-nerds are going to get something like this in 5-10 years.
Probably much sooner than you think. AI is moving pretty quick.
Even for self-hosting like this guy's doing?
122 sats \ 1 reply \ @mrsu 15 Jun
Yes. It's getting much easier. You can easily spin up a backend now without much tech knowledge. You don't even need a GPU (although one speeds things up significantly).
If you're interested, look into Ollama or LM Studio. They provide APIs to interface with the LLMs. Then there are a bunch of clients you can install and point to this endpoint.
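To give a sense of how simple that client side is: here's a minimal sketch of calling a local Ollama server's `/api/generate` endpoint from Python (stdlib only). It assumes Ollama is running on its default port 11434 and that you've pulled a model named `llama3` — swap in whatever model you have.

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default local endpoint

def build_payload(prompt, model="llama3"):
    """Build the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="llama3", host=OLLAMA_HOST):
    """Send a prompt to a locally running Ollama server and return its reply."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the full answer in "response"
        return json.loads(resp.read())["response"]

# Usage (with the server running):
#   print(ask_ollama("Why self-host an LLM?"))
```

Any of the chat clients mentioned above are basically doing this same request under the hood, just with a nicer UI pointed at that endpoint.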
Thanks, I'll have a look.