Yes. If you have a desktop with a dedicated GPU or a MacBook with unified memory, just do it. If you have other hardware configurations, maybe don't.
Yeah, I've got an M2 Mac right now, but eventually I'll pick up a beastly GPU to handle larger models. For the past year or a bit more, I've been running Mistral (and a couple of others) locally with Ollama.
I spent some time today using Python + Ollama (Mistral) + ChromaDB to vectorize my journal entries, and now I'm tinkering with a RAG implementation... it's not my favorite thing yet, but I've only just gotten the RAG to start returning documents from the DB.
Mostly I'm interested in using the OpenWeb-UI front end (plus its various tools) to augment myself & processes.
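For anyone curious, the pipeline described above (journal entries -> embeddings -> ChromaDB -> retrieval) can be sketched roughly like this. It's a minimal sketch, not my exact code: it assumes a local Ollama server with the `mistral` model pulled, plus the `ollama` and `chromadb` Python packages; the chunk sizes, collection name, and DB path are made-up placeholders.

```python
# Minimal RAG indexing/retrieval sketch. Assumes a running Ollama server
# with "mistral" pulled; collection name and path are hypothetical.

def chunk_text(text: str, size: int = 500, overlap: int = 100) -> list[str]:
    """Split an entry into overlapping character chunks for embedding."""
    chunks, step = [], size - overlap
    for start in range(0, max(len(text) - overlap, 1), step):
        chunks.append(text[start:start + size])
    return chunks

def index_and_query(entries: dict[str, str], question: str, k: int = 3) -> list[str]:
    # Imported here so chunk_text() above works without these packages installed.
    import ollama
    import chromadb

    def embed(text: str) -> list[float]:
        # One embedding vector per chunk, from the local Mistral model.
        return ollama.embeddings(model="mistral", prompt=text)["embedding"]

    client = chromadb.PersistentClient(path="./journal_db")  # hypothetical path
    col = client.get_or_create_collection("journal")
    for entry_id, text in entries.items():
        for i, chunk in enumerate(chunk_text(text)):
            col.add(ids=[f"{entry_id}-{i}"],
                    embeddings=[embed(chunk)],
                    documents=[chunk])
    # Nearest-neighbour lookup over the stored chunks.
    return col.query(query_embeddings=[embed(question)], n_results=k)["documents"][0]

# Usage (requires Ollama running locally):
#   docs = index_and_query({"2024-01-01": "Started the year hiking..."},
#                          "What did I do on New Year's Day?")
```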
15 sats \ 1 reply \ @kepford 17h
I have a colleague who was showing me a proof of concept using some models to build a semantic search over some data... Kinda wild how simple it is to do. That's what surprised me most. The complexity is really in the models. I've been dipping my toe into running stuff locally as well. It's interesting.
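The "simple" part is real: once a model maps texts to vectors, the search itself is just a similarity comparison. A toy illustration, with made-up three-dimensional vectors standing in for real embeddings:

```python
# Toy semantic search: rank documents by cosine similarity to a query
# vector. The vectors here are fabricated for illustration; a real setup
# would get them from an embedding model.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

docs = {
    "hiking trip": [0.9, 0.1, 0.0],
    "tax paperwork": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # hypothetical embedding of "outdoor activities"
best = max(docs, key=lambda d: cosine(docs[d], query))
print(best)  # → hiking trip
```

All the hard work lives in producing good vectors; the retrieval step stays this small even when the corpus grows (vector DBs like ChromaDB just make the nearest-neighbour lookup fast).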
I'm using this as a frontend locally: https://docs.openwebui.com