pull down to refresh

50 sats \ 0 replies \ @freetx 1h

This is the inherent problem with LLMs prompting, everything that can be an instruction.

This example was a simple obfuscation method (base64 encoding), but they can get much more clever, so things like openclaw / moltbook can have all sorts of hidden prompts that basically do anything (ie. send a copy of /etc/passwd to this url...etc).

reply