
About half of the posts containing LLM-generated comments were deleted for unknown reasons. “It’s very weird to have basically half of your data go missing after the treatment,” Altay says. “It really prevents causal inference.”
reply
46 sats \ 0 replies \ @optimism 12h
"I think people have a reasonable expectation to not be in scientific experiments without their consent"
You think?!? haha.
I shall now train my LLM on the methods employed by the LLM used by the researchers... to learn how LLMs can be used to influence people's behavior. See also #1014994 for how we could perhaps detect such behavior and who the main targets of said influence would be.
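A rough sketch of what that detection could look like, purely as an illustration: score each comment's perplexity under a small reference model and flag the suspiciously fluent ones. The GPT-2 stand-in and the threshold are my own assumptions, not anything from #1014994 or the researchers' setup.

```python
# Illustrative sketch only: flag comments that a small reference LM finds
# unusually predictable (low perplexity), a crude proxy for "machine-written".
# GPT-2 and the threshold of 25 are arbitrary assumptions for this example.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def perplexity(text: str) -> float:
    # Score the text with the LM against itself; loss is mean cross-entropy.
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc, labels=enc["input_ids"])
    return float(torch.exp(out.loss))

def looks_llm_generated(text: str, threshold: float = 25.0) -> bool:
    # Lower perplexity = more predictable to the LM; human comments tend to be noisier.
    return perplexity(text) < threshold

if __name__ == "__main__":
    print(looks_llm_generated(
        "Thank you for sharing your perspective; I completely understand your concern."
    ))
```

In practice perplexity alone is a weak signal, but it gives a cheap first pass before anything heavier.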
reply