Hmm, maybe that was premature - ChatGPT has history on by default now, so maybe that's where it got the idea that it was a classic puzzle?
With history off, it still sounds like the problem is in its training data, but the output is much more bizarre:
https://markdownpastebin.com/?id=68b58bd1c4154789a493df964b3618f1
Could also be randomness.
Selected snippet:
Both ferrymen row their two boats across (time = D/v = 1/3 h).
One ferryman (say A) swims back alone to the west bank (time = D/u = 1 h).
That same ferryman (A) now rows the second boat back across (time = 1/3 h).
Meanwhile, the other ferryman (B) has just been waiting on the east bank—but now both are on the east side, and both boats are there.
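For reference, just summing the legs it quotes: D/v + D/u + D/v = 1/3 + 1 + 1/3 = 5/3 h total, a full hour of which is the swim alone.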
I have to say, with history off it sounds like an even more ambitious moron. I think the history feature may be sort of freezing bot behavior in time: the bot sees a lot of its own past outputs, and in the past it was a lot less into shitting LaTeX all over the place when doing a puzzle.