It's easy to guard against AI hallucination if you just freakin' check the output and don't be lazy.
I could never keep up with LLMs spitting out vibe code if I seriously reviewed & tested it all, though. This is why it's unsuitable for (a) open source projects that are depended upon for production use (i.e. libraries or daemons) and (b) anything that carries liability - ask lawyers about hallucinations, lol.
If a programmer or lawyer puts out hallucinated work, it's on them and they should be fired.