pull down to refresh

Wow! There were times when that would have come in very handy, long ago! I suppose they do life hacks, too.

Whenever googling coding issues, there is a very high likelihood of ending up on SO. ChatGPT has probably been trained on SO (even though when asking ChatGPT it claims it hasn't~~).

reply

Of course it would claim it hasn’t! The company is in ongoing court hassles over training materials. I would bet that ChatGPT denies all training materials if asked directly, except those that are free and open.

reply

It'd be fun to find a jailbreak forcing it to answer truthfully to this question~~

reply

I guess there are jailbreak prompts, but I have never used ChatGPT, myself.
I have looked at other’s usage, though. Apparently you have to set a scenario for the jailbreak, and then make your query to the jailbroken AI.

reply