🎯 LLMs are very adept at matching the intent of what you're looking for to actually useful information. There is less "skill" required to engineer "the perfect search term". You can ask the question the way you'd naturally ask it, and it will generally get you close to the target within one or two volleys.
As I covered in this one-minute section from the @jsonbits meetup, the reason is that vectorization allows for the "quantization of language": you can ask the same question 40 ways, with different vernacular or even in different languages, and still get a similar answer that's meaningful and useful. More than anything else, it's a revolution in the flexibility of computational systems.
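
If you want to see that idea in code, here's a minimal sketch (assuming the sentence-transformers library and a multilingual MiniLM model I picked for illustration, not anything from the talk): several phrasings of the same question, in different wording and even a different language, get embedded as vectors, and their pairwise cosine similarity shows they land close together.

```python
# Minimal sketch: embed several phrasings of the same question and check
# that they end up near each other in vector space. The model name below
# is an example choice, not anything specific to the meetup talk.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

phrasings = [
    "How do I reset my router?",
    "My wifi box is acting up, how do I restart it?",
    "¿Cómo reinicio mi router?",  # same intent, different language
]

# Each sentence becomes a fixed-length vector; similar meaning -> nearby vectors.
embeddings = model.encode(phrasings, convert_to_tensor=True)

# Cosine similarity between every pair of phrasings.
scores = util.cos_sim(embeddings, embeddings)
for i in range(len(phrasings)):
    for j in range(i + 1, len(phrasings)):
        print(f"{phrasings[i]!r} vs {phrasings[j]!r}: {scores[i][j]:.2f}")
```

The exact numbers will vary by model, but the point is that wildly different surface forms of the same question score as highly similar, which is why you don't need "the perfect search term" anymore.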