
How many apps do you come across on a daily basis that are infused with AI? Calorie trackers, CRMs, study companions, relationship apps 👀. Most of them use ChatGPT on the backend, meaning they share your data with OpenAI. All of it. It doesn't need to be this way.
Today we’re proud to launch Maple Proxy, a lightweight bridge that lets you call Maple’s end‑to‑end‑encrypted large language models (LLMs) with any OpenAI‑compatible client without changing a line of code.
Your existing OpenAI libraries keep working. Your data stays inside a hardware‑isolated enclave.
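As a sketch of what "without changing a line of code" means: an OpenAI-compatible client just points its base URL at the proxy and sends the same chat-completions payload as before. The endpoint URL and model name below are hypothetical placeholders, not Maple's actual values.

```python
import json

# Hypothetical proxy endpoint -- substitute the real Maple Proxy URL.
BASE_URL = "https://enclave.example.com/v1"

def chat_request(model, messages):
    """Build the standard OpenAI /chat/completions request that any
    OpenAI-compatible client emits; only the base URL differs."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "body": json.dumps({"model": model, "messages": messages}),
    }

req = chat_request("llama-3.1-8b", [{"role": "user", "content": "Hello"}])
```

Since the wire format is unchanged, existing SDKs (e.g. the official `openai` Python client) work by overriding their base URL in configuration.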
33 sats \ 0 replies \ @k00b 11h
It sounds like most of the heavy lifting is mimicking OpenAI's API. Are LLMs generic enough that they're otherwise interchangeable? So long as we're using a model that isn't wrapped in a bespoke API, we can just swap models?
That's pretty cool, and I guess that's the benefit of natural language being the "raw" interface to all these models. For embeddings it seems a little trickier - the models tend to truncate differently afaict, and the output vectors can vary in dimensionality.
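The embeddings point can be made concrete: vectors from different models live in different spaces and often have different lengths, so similarity math breaks outright. The dimensions below are illustrative only, not tied to any specific model.

```python
import math

# Illustrative sizes: two embedding models with different output dimensions.
vec_a = [0.1] * 1536
vec_b = [0.1] * 768

def cosine(u, v):
    # Cross-model comparison fails immediately: the dot product is
    # undefined when the vectors have different lengths.
    if len(u) != len(v):
        raise ValueError("dimension mismatch: embeddings are not interchangeable")
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)
```

Even when dimensions happen to match, vectors from two models aren't in the same space, so an index built with one model generally has to be re-embedded to switch to another.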