Imagine Apple Intelligence on iPhone, or something Ollama-esque on Android, that was actually good. Private. On-device. Every question queries the relevant data via API/SQL from your phone's data: your notes, journal, photo library, health data, chats, emails. And uses them just like, e.g., ChatGPT uses web search.
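To make the mechanics concrete, here is a minimal sketch of that loop in Kotlin. Every name in it is hypothetical (no such API exists on either platform); the point is only the shape: the model either emits a query against an on-device index or a final answer, exactly like a web-search tool call.

```kotlin
// Minimal sketch; LocalModel, PersonalIndex, and ModelStep are all
// hypothetical, not a real Apple or Google API.

sealed interface ModelStep
data class SqlQuery(val sql: String) : ModelStep      // model wants to search local data
data class FinalAnswer(val text: String) : ModelStep  // model is done

interface LocalModel {
    // One inference step over the conversation so far.
    fun step(conversation: List<String>): ModelStep
}

interface PersonalIndex {
    // SQL over the on-device index: notes, journal, OCR'd photos, health data.
    fun run(sql: String): List<String>
}

// The same loop ChatGPT runs with web search, but against your own data:
// the model keeps querying until it has enough context to answer.
fun answer(question: String, model: LocalModel, index: PersonalIndex): String {
    val conversation = mutableListOf(question)
    while (true) {
        when (val step = model.step(conversation)) {
            is SqlQuery -> conversation += index.run(step.sql).joinToString("\n")
            is FinalAnswer -> return step.text
        }
    }
}
```

The menu question below then becomes a query over OCR'd photo text instead of a web search.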
Would you use it? Would you take a pic of a restaurant's menu and, a year later, ask your phone what kinds of burgers they had? Would you journal random bits of information throughout the day and, when necessary, ask your new second brain about the stuff you wanted to remember?
Or would you not use it and continue using the device the conventional way?
I use the `proofread` feature as a free Grammarly on my phone. It helps me solve some grammar dilemmas. That's it.

LLM access could be an OS permission, like `camera` or `microphone`. Amber doesn't need an LLM, so it doesn't need the permission. Obsidian could use an LLM, so it does need the permission, optionally, and when I enable it, it will use it.
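A sketch of how that gate could look on Android, reusing the existing runtime-permission machinery. The permission name is invented; the `checkSelfPermission` call is the real androidx one.

```kotlin
import android.content.Context
import android.content.pm.PackageManager
import androidx.core.content.ContextCompat

// Hypothetical permission string; Android defines no such permission today.
const val PERMISSION_LOCAL_LLM = "android.permission.LOCAL_LLM"

// Same shape as a camera or microphone check: Amber would simply never
// request it, while Obsidian would declare it in its manifest and call
// this before running any inference.
fun canUseLocalLlm(context: Context): Boolean =
    ContextCompat.checkSelfPermission(context, PERMISSION_LOCAL_LLM) ==
        PackageManager.PERMISSION_GRANTED
```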
The OS could expose a `knowledge` cache in the same way, so that an app (not a centralized process) can submit new knowledge (for processing and then caching) and query it, much like your "second-brain" idea. The `llm-to-app` interface is both more powerful and riskier than `app-to-llm`, but `app-to-llm` is easier to both standardize and optimize. I think it really depends on what you want to achieve.
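For the `app-to-llm` direction, the whole cache could be as small as two calls. A hedged sketch, with every name invented:

```kotlin
// Hypothetical OS service; nothing like this ships today. In the
// app-to-llm direction the app decides what goes in and what gets asked;
// the model never reaches into the app on its own.
interface KnowledgeCache {
    fun submit(note: String)                   // OS processes, indexes, caches
    fun query(question: String): List<String>  // OS answers from cached knowledge
}

// E.g. a journaling app storing a random bit of the day and recalling it later.
fun demo(cache: KnowledgeCache) {
    cache.submit("The burger place on 5th does a blue-cheese lamb burger")
    println(cache.query("What burgers did that place on 5th have?"))
}
```

The `llm-to-app` inversion, where the OS model calls into apps to pull data on demand, is the more powerful variant, and the one that would most need a permission gate like the sketch above.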
Footnotes

1. I've always loved the "`<bad app>` … and prevented" message from GrapheneOS, just like I've always loved SELinux, despite its complexity. It's always nice to have OS-level (and hardware) protections against naughty software.