I hate these examples. I feel they do such a bad job of showing that LLMs are lying machines, and they just make the journalists look like pedantic idiots. There's a perfect example of LLMs being lying machines, and it's from the GPT-5 presentation. GPT-5 tries to explain how lift works like a PhD holder, and (according to what I've read from people who actually have a PhD) it gets it completely wrong. Its explanation is incorrect, and the wing shape it uses as an example is incorrect. It just all-around fails to provide a correct explanation.
I asked it the same question and it answered correctly, then I tried five or six other random wordings, and every one was correct.