0 sats \ 0 replies \ @Scoresby 3h \ on: Logs Show ChatGPT Leading a Vulnerable Man Directly Into Severe Delusions AI
If you asked chat to come up with an aspirational slogan for itself this is probably not far from what it would produce.
Can something like this be called self-harm? If chat doesn't have some sort of consciousness (I don't think it does, and it doesn't seem like many other people do, either), then the only thing generating the responses here is the user's own input.
Maybe "sycophancy" isn't the right word, either. Sycophancy implies some agency or intent, when what actually seems to be happening is that chat responds to an input with the most likely continuation, shaped by training weights tuned to make the thing feel like good customer service.
It seems to me that the real culprit here is people believing a product is far more capable and reasonable than it actually is.