
Fake news.... I just asked it "how many b's are in the word blueberry" and it answered 2.
I just got it to respond incorrectly to "how many r's are in congratulations" https://chatgpt.com/share/689a478b-e6ec-8008-b45b-4fe285cb27a2 even when I told it to check with Python.
This is an old explanation of why it gets it wrong: https://www.reddit.com/r/OpenAI/comments/1haxhjk/can_someone_explain_exactly_why_llms_fail_at/ I also noticed it seems to do better if it "thinks".
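For what it's worth, the check the model was asked to do is a one-liner in plain Python, so if it actually ran the code the counts would come out right:

```python
# Count occurrences of a letter by looking at the actual characters,
# not tokens -- this is what the model fails to do internally.
print("congratulations".count("r"))  # 1
print("blueberry".count("b"))        # 2
```

The Reddit thread above covers the usual explanation: the model sees tokens rather than individual characters, so it never "reads" the letters it is being asked to count.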
0 sats \ 1 reply \ @ken 2h
Maybe a bug in Python?
Or perhaps the word "congratulations" really does have 2 Rs
Either way, I would just trust the language model. Who are we to criticize it?
Maybe ChatGPT is Japanese
If it depends on a specific form of input to respond correctly, that's because it remains what it always was: a calculation presented in the form of language, not something truly intelligent.