denial@feddit.org to Fuck AI@lemmy.world · "phd-level reasoning" · English · 12 days ago

I think you're making this too complicated.

The prompt is very simple, and the answer is “one trip”. The LLM stumbles because the prompt contains trigger words that make it look like the classic goat-and-cabbage river-crossing puzzle. To a human it clearly is not that puzzle, but an LLM cannot tell the difference.

The prompt may be somewhat adversarial, but it is still a very simple question that the LLM is unable to answer, because it fundamentally has no understanding of anything at all.

This prompt works great to drive home that simple fact, and it shows that all the touting of “reasoning skills” is just marketing lies.