A simple proof that GPT might be knowledgeable, but is also dumb.

Two words: cryptic crosswords. Most of the clues I've given it have produced nonsense answers, nonsense reasoning, or both.

I'm sure it could be fine-tuned to solve cryptic crosswords, but I think this raises some interesting questions about the nature of its 'intelligence' out of the box, in an AGI sense.