This link was posted by JohnTheNerd 3 hours ago on HN. It received 44 points and 4 comments.
0 sats \ 4 replies \ @OT 15 Jun
Pretty crazy.
I guess us non-nerds are going to get something like this in 5-10 years.
reply
21 sats \ 3 replies \ @mrsu 15 Jun
Probably much sooner than you think. AI is moving pretty quick.
reply
0 sats \ 2 replies \ @OT 15 Jun
Even for self-hosting like this guy's doing?
reply
122 sats \ 1 reply \ @mrsu 15 Jun
Yes. It's getting much easier. You can easily spin up a backend now without much tech knowledge. You don't even need a GPU (although one speeds things up significantly).
If you're interested, look into Ollama or LM Studio. They provide APIs to interface with the LLMs. Then there are a bunch of clients you can install and point to this endpoint.
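For example, here's a minimal sketch of hitting a local Ollama endpoint from Python, assuming Ollama is running on its default port (11434) and you've already pulled a model (the model name "llama3" here is just an example):

```python
import json
import urllib.request

# Ollama's default local HTTP endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Encode a non-streaming generate request for Ollama's HTTP API."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return its reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("llama3", "Why is the sky blue?"))
```

Any client that speaks this API (or LM Studio's OpenAI-compatible one) can be pointed at the same endpoint.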
reply
0 sats \ 0 replies \ @OT 15 Jun
Thanks
I’ll have a look
reply