This is interesting. Why do you think that a well tuned/instructed AI is worse at (for example) writing unit tests than humans though? When I write tests, I use structural analysis and perhaps some intuition, which is arguably seen as the most human trait in cognitive skills. If the premise that intuition is actually pattern recognition is true at all, then maybe it's just a matter of developing the right model?
To be clear, I'm not saying that I'd particularly like that outcome (some of the most pleasant interactions throughout my career have been with people who found bugs in my code, so I'd consider this a real loss socially), but I do think it's a plausible outcome, and probably a near-term one, unless there is magic going on in intuition that we don't understand and can't emulate (yet). But then, it's still only a matter of time until we discover it?
better yet, write fucking unit tests around what the AI spit out
pro tip: get the AI to write the tests first
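A minimal sketch of what "tests first" looks like in practice, using a hypothetical `slugify` function as the example (the function and its spec are made up for illustration): you write the assertions before any implementation exists, then run them against whatever the AI produces.

```python
# Test-first sketch (hypothetical example): the spec below is written
# BEFORE asking the AI for an implementation. Whatever code it returns
# either passes these assertions or gets sent back.

def slugify(text: str) -> str:
    """Stand-in implementation; in the test-first workflow this body
    would come from the AI after the assertions below are written."""
    return "-".join(text.lower().split())

# The spec, written up front:
assert slugify("Hello World") == "hello-world"
assert slugify("  a   b  ") == "a-b"
assert slugify("already-fine") == "already-fine"
```

The point is that the tests pin down the behavior you actually want, so you're reviewing the AI's output against a concrete spec rather than eyeballing it.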
Because there's an absolute bazinga of algorithms that are a waste of time to memorize, because in your career you'll maybe run into needing 3.
For those 3 times there are LLMs, which serve two functions:
I'm paid six figures to be the supervisor:
Those tasks, LLMs are NOWHERE CLOSE to being able to do.