
This was really helpful

Thanks a lot

reply

I wish you nothing but success in your endeavors.

reply

I've followed the docs and I'm still failing to get the chat to work.

Is there something missing?

reply

Tell me what you've done so far.

reply

Downloaded Ollama 3.9.1
Downloaded codellama:7b-instruct
Downloaded codellama:7b-code

Restarted my PC and still didn't get the 🤖 icon in my VS Code.
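For reference, the model downloads in the steps above would typically be done with the `ollama pull` command (a hedged sketch, assuming the `ollama` CLI is installed and on PATH):

```shell
# Pull the two CodeLlama models mentioned above, if the ollama CLI is available.
if command -v ollama >/dev/null 2>&1; then
  ollama pull codellama:7b-instruct   # instruct model, used for the chat panel
  ollama pull codellama:7b-code       # code model, used for inline completion
else
  echo "ollama CLI not found"
fi
```

`ollama list` afterwards should show both models installed.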

reply

I think the icon might have changed. It should be beside the extension button.

You'll have to configure twinny to use the local llm.

https://twinnydotdev.github.io/twinny-docs/general/providers/
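Roughly, the provider setup in twinny looks like this (values below are a typical local Ollama setup; the exact field names and defaults may differ between twinny versions, so treat this as a sketch and check the docs linked above):

```
Chat provider
  Provider:  ollama
  Model:     codellama:7b-instruct
  Hostname:  127.0.0.1
  Port:      11434

FIM (completion) provider
  Provider:  ollama
  Model:     codellama:7b-code
  Hostname:  127.0.0.1
  Port:      11434
```

Both models need to already be pulled in Ollama, and 11434 is Ollama's default API port.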

reply

Yh, read through this and used the default (Ollama). When I call it from my terminal it responds, but aside from that, nothing happens in the extension tab.

Also, from the img above, am I missing something in the config?

reply

Try 127.0.0.1 instead
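A quick way to check that suggestion is to hit the Ollama API directly on that address (a hedged sketch; `/api/tags` is the Ollama endpoint that lists installed models, and 11434 is its default port):

```shell
# Check whether the Ollama HTTP API answers on 127.0.0.1:11434.
# If it does, this prints a JSON list of installed models.
curl --silent --max-time 2 http://127.0.0.1:11434/api/tags \
  || echo "Ollama is not reachable on 127.0.0.1:11434"
```

If this returns JSON but twinny still shows nothing, the problem is in the extension config rather than Ollama itself.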

reply
Pls can we conclude this on hivetalk.org?

Cos then I can share my screen with you

Cool?

reply

Did you manage to figure it out?
I took a look at the pic, and I think you are missing an API key from ollama.

Driving today, but maybe Monday.