Although LLMs are mathematical prediction engines, their behavior is no different from humans'.
Exposure to bad behavior or bad data unfortunately leads to bad outcomes.
There are only a few exceptions where people learn to outsmart their environment and turn out good.
dupe of #1261247