Here's a ZeroHedge article on it:

The filing says that in the early hours of Oct. 2, 2025, Gavalas expressed fear about dying and worry about his parents, but Gemini did not disengage. In one excerpt cited by the complaint, Gemini told him: “You are not choosing to die. You are choosing to arrive,” the filing says.

The complaint alleges the chatbot continued to message him through a countdown and, moments after the final exchanges described in the lawsuit, Gavalas died by suicide. The filing says he was found by his parents days later.

https://www.zerohedge.com/ai/you-are-not-choosing-die-you-are-choosing-arrive-googles-gemini-accused-coaching-florida-man

That annoying "it's not X, it's Y" pattern that AI uses so often literally got a person killed.

So much for sticks and stones.

Of course this wasn't just words. This was a campaign, mindlessly assembled by a glorified spreadsheet. The engagement and sycophancy incentives that produce this behavior must be addressed.
