A chatbot can't lie. Lying requires intent, and AIs have zero intent. It's becoming clear to me that this is a fundamental misunderstanding the masses have about machines and today's AI.
They do what they were designed to do. They know nothing; they spit out what they were built to spit out. It's statistical text generation based on a prediction model.
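To make "statistical text generation based on a prediction model" concrete, here is a toy sketch: a bigram model that counts which word follows which in a tiny made-up corpus, then generates by sampling from those counts. Real LLMs are vastly larger neural networks, but the principle is the same; the corpus and function names here are purely illustrative.

```python
import random
from collections import defaultdict

# Made-up corpus for illustration only.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word.
counts = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev].append(nxt)

def generate(start, n=5, seed=0):
    """Sample up to n next words. No knowledge, no intent -- just statistics."""
    random.seed(seed)
    word, out = start, [start]
    for _ in range(n):
        if word not in counts:  # dead end: word never had a successor
            break
        word = random.choice(counts[word])
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

The output is grammatical-looking word salad: every word is plausible given the previous one, but nothing is "known" or "checked", which is exactly the point.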
True, but from a user's side it can feel like lying when it confidently gives wrong answers. The tricky part is that people think it "knows" things, when really it's just predicting words that sound right.
reply
Exactly. It's predicting based on how it was trained. It's not checking its work; it just isn't designed to do that. This is why "agentic" is the buzzword: you can have one bot generating text and another checking it. But even if that functionality is baked in, the one bot you interact with isn't aware of anything. It's guessing.
The problem is not with AI. It's with the deception coming from people like Scam Altman.
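The generator/checker split described above can be sketched in a few lines. Both functions here are hypothetical stand-ins, not any real API: the "generator" just emits a confident string, and the "checker" is a separate step that verifies it against known facts, which the generator itself never does.

```python
def generate_answer(question):
    # Stand-in for a text generator: returns a confident-sounding guess,
    # with no awareness of whether it is true.
    return "Paris is the capital of France."

def check_answer(answer, facts):
    # A separate verifier comparing the output against a trusted fact set.
    return answer in facts

facts = {"Paris is the capital of France."}
draft = generate_answer("What is the capital of France?")
print(draft if check_answer(draft, facts) else "flagged for review")
```

The key design point is that verification lives outside the generator: the generator's confidence is identical whether its output is right or wrong.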