How often does the bat and ball problem appear in ChatGPT's training data?
I’d still argue that it can reproduce the solution it learned, not actually solve it.
J. says:
Indeed. DuckDuckGoing for “Bat and Ball Problem” gives several hits with the precise formulation noted here. Curious whether it would respond correctly if you replaced “Bat and Ball” with “Keyboard and Mouse”, and “$1.10 and $1.00” with “$50 and $10”, or some other arbitrary formulation?
Nevertheless, ChatGPT's knowledge of obscure math problems is interesting, though not particularly indicative of its intelligence (similar to some people I encountered at university).
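For reference, any variant of the problem reduces to the same algebra, which is why substituting arbitrary items and prices is a good test of whether the solution was reasoned or memorized: if two items together cost a total T and the pricier one costs D more than the other, the cheaper item costs (T − D) / 2. A minimal sketch (item names and amounts are just the examples from the comment above):

```python
from fractions import Fraction

def solve(total, difference):
    """Return (cheaper, pricier) prices for two items that together cost
    `total`, where the pricier item costs `difference` more than the other."""
    cheaper = (total - difference) / 2
    return cheaper, cheaper + difference

# Original bat-and-ball numbers: $1.10 total, bat costs $1.00 more.
# Fractions avoid float rounding; the ball costs $0.05, not the
# intuitive $0.10.
ball, bat = solve(Fraction("1.10"), Fraction("1.00"))

# The hypothetical "keyboard and mouse" variant with $50 and $10:
mouse, keyboard = solve(50, 10)
```

With the substituted numbers the mouse comes out to $20 and the keyboard to $30; a model that merely pattern-matched the famous formulation might instead echo the memorized $0.05.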