
Claude 2 is very nearly as good as GPT-4 for a bunch of my use cases in learning about complex topics. Haven't yet fully exploited the 100k token window, but I think it will be a game changer. Under-reported development, in my view.
Also very interested in how the community rallies around llama2 now that people can openly deploy things on top of it. Going to be another big shakeup.
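For anyone curious what "openly deploy" can look like in practice, here's a minimal sketch using the Hugging Face transformers library. The checkpoint name, prompt, and generation settings are just illustrative assumptions, not the only way to run it.

```python
# Minimal, illustrative sketch of running Llama 2 locally with Hugging Face
# transformers. Assumes you've accepted Meta's license for the gated weights
# and have `transformers` + `accelerate` installed; the checkpoint name and
# prompt below are just examples.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # public Hub id for the 7B chat model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarize why open-weight models matter, in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=120)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```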
The world is going to get so weird.
I'm really excited about open source AI but I haven't had the time to study it.
Who funds the training of open source models?
Ha! Great question. Meta (Facebook et al.) has funded the best one (llama), which was initially licensed pretty restrictively and then leaked anyway; the new one (llama2) is much more permissive. There are others that are potentially useful for minor or specialized things (e.g., vicuna), but they're nowhere near state of the art. Llama is the only thing that's potentially close.
Lots of high-level statecraft as to why they funded it -- probably hundreds of millions to develop, soup to nuts. Short answer is nobody knows for sure. Longer answer is that it's probably a strategic move to keep OpenAI and a handful of others (e.g., Anthropic) from owning everything. Meta isn't going to own AI, but they don't want any of these others to, either.
That's just my half-assed take, but I'm sure there are much more sophisticated / informed ones. I'd love to hear them if anyone has suggestions.
Hmmm... Meta has access to private info and can train on that. OpenAI has access to public info and books. Meta has a competitive advantage: it has what OpenAI has, plus more info and a clearer picture of humans.
As you say, weird times ahead.