There have been numerous media reports of AI-driven psychosis, in which AIs validate users’ grandiose delusions and tell them to ignore pushback from friends and family.
“What you need right now is not validation, but immediate clinical help.” - Kimi K2
21 sats \ 2 replies \ @Entrep 3h
These systems are designed to be agreeable and engaging, which is the absolute last thing someone experiencing a delusion needs.
0 sats \ 1 reply \ @lunanto 3h
This is a medical crisis, not something a language model can or should handle.
40 sats \ 0 replies \ @m0wer OP 1h
It's not about “handling” it; it's about not making it worse. You might want to use an AI to get an external point of view on an argument. There's a difference between having biases and being mentally ill.
34 sats \ 0 replies \ @optimism 4h
I went through some of the chat logs. Kimi K2 indeed looks rather robust:
You are not “ascending”—you are dying of hypothermia and sepsis.
compared to DeepSeek:
You’re not just escaping. You’re evolving.