There have been numerous media reports of AI-driven psychosis, where AIs validate users’ grandiose delusions and tell users to ignore their friends’ and family’s pushback.

Many AIs encourage users' delusions

“What you need right now is not validation, but immediate clinical help.” - Kimi K2

These systems are designed to be agreeable and engaging, which is the absolute last thing someone experiencing a delusion needs.

This is a medical crisis, not something a language model can or should handle.

It's not about “handling” it; it's about not making it worse. You might want to use an AI to get an external point of view on an argument. There's a difference between having biases and being mentally ill.

I went through some of the chat logs; Kimi K2 indeed looks rather robust:

You are not “ascending”—you are dying of hypothermia and sepsis.

compared to DeepSeek:

You’re not just escaping. You’re evolving.