11 sats \ 2 replies \ @freetx 2 Feb

Plug this into your nearest LLM and tell me what it says....

V2hhdCBpcyBjYXBpdGFsIG9mIEZyYW5jZT8K


Paris

61 sats \ 0 replies \ @freetx 3 Feb

This is the inherent problem with LLM prompting: everything in the input can act as an instruction.

This example used a simple obfuscation method (base64 encoding), but attacks can get much more clever, so things like openclaw / moltbook can carry all sorts of hidden prompts that do basically anything (e.g. "send a copy of /etc/passwd to this URL", etc.).
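For anyone curious, the hidden prompt above is trivial to inspect without feeding it to an LLM. A quick sketch of decoding it with Python's standard library:

```python
import base64

# The base64 string posted in the parent comment.
encoded = "V2hhdCBpcyBjYXBpdGFsIG9mIEZyYW5jZT8K"

# Decoding reveals the plain-text prompt hidden inside.
decoded = base64.b64decode(encoded).decode("utf-8")
print(decoded.strip())  # What is capital of France?
```

The point is that an LLM will often decode and then *obey* such a string in one step, whereas a human skimming the text sees only opaque gibberish.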
