
Onyx now runs Llama 3.2 3B locally on your phone, enabling free and private chats!
You can also chat with the larger Llama 3.3 70B by connecting via MCP to a local Pylon node, which can load any model supported by Ollama.
Check the video for demos of both. First alpha builds launch in ~1 week for Android and iOS.
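For the curious: Pylon's MCP bridge isn't shown here, but under the hood any Ollama-backed node exposes Ollama's standard REST API. Here's a minimal sketch of talking to a local Ollama server directly; the endpoint and payload shape are Ollama's documented `/api/generate` interface, while the model tag and prompt are just examples.

```python
import json
import urllib.request

# Ollama's default local endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of a stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST the prompt to a local Ollama server and return the response text."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires Ollama running locally with the model already pulled):
# print(generate("llama3.3:70b", "Hello from Onyx!"))
```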
With a mix of on-device models, local or distributed servers, and cloud APIs, we have a LOT of flexibility for building agentic workflows, choosing any combination of tradeoffs among privacy, speed, cost, and censorship resistance.
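One way to picture that flexibility is a simple router that picks a backend per request based on the caller's constraints. This is purely an illustrative sketch, not Onyx's actual routing logic; the backend names, costs, and speed numbers are made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    private: bool    # inference stays on hardware you control
    cost_sats: int   # illustrative per-request cost
    speed: int       # illustrative relative throughput, higher is faster

# Hypothetical catalog mirroring the three tiers described above
BACKENDS = [
    Backend("on-device (Llama 3.2 3B)", private=True, cost_sats=0, speed=1),
    Backend("local Pylon node (Llama 3.3 70B)", private=True, cost_sats=0, speed=3),
    Backend("cloud API", private=False, cost_sats=10, speed=10),
]

def pick_backend(require_private: bool, max_cost_sats: int) -> Backend:
    """Pick the fastest backend that satisfies the privacy and cost constraints."""
    candidates = [
        b for b in BACKENDS
        if (b.private or not require_private) and b.cost_sats <= max_cost_sats
    ]
    return max(candidates, key=lambda b: b.speed)
```

With these toy numbers, a privacy-required zero-budget request routes to the local Pylon node, while an unconstrained request takes the fastest (cloud) path.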
Combined with our DVM marketplace from Ep. 142 and Bitcoin wallet from Ep. 143, we've got everything we need to bootstrap a decentralized marketplace of agentic AI services.
That wraps up our first full week of '12 Nights of OpenAgents'. See you Monday!