Mine tries to lie whenever it doesn't know something. I'll call it out and say that's a lie, and it just goes "you are absolutely correct." Unbelievable.

I've been reading about sleeper agents planted inside local LLMs, and that's pushing me closer to deleting it for good. Which is a shame, because it's become the new search engine now that they've ruined the actual search engines.

  • rozodru@piefed.social

    Because it's trying to reach the solution as quickly as possible. It will skip steps, it will claim it's done something when it hasn't, and it will suggest things that may not even exist. It NEEDS to reach that solution, and it wants to do it as efficiently and quickly as possible.

    So it's not really lying to you; it's skipping ahead, coming up with solutions it believes should theoretically work because they're the logical answer, even if some piece needed to get there doesn't actually exist.

    The trick is to hold its hand: always require sources for every potential solution. Basically you have to make it "show its work", like in high school when your teacher made you show your work in maths. In the same way, have it provide its sources. If it can't provide a source, the suggestion probably isn't going to work.
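
    To make that concrete, here's a rough sketch of what "always require sources" can look like if you're scripting against a local model. Assumptions: an Ollama-style endpoint on localhost:11434 and a model called "llama3" (swap in whatever you actually run); the prompt wording is just one way to phrase it, not a magic incantation.

        import json
        import urllib.request

        # Assumed Ollama-style local endpoint and model name -- adjust to your setup.
        OLLAMA_URL = "http://localhost:11434/api/generate"
        MODEL = "llama3"

        # System prompt that demands a source for every claim and allows "I don't know".
        SYSTEM = (
            "For every claim or suggested fix, cite a specific source "
            "(official docs URL, man page, or RFC). If you cannot name a "
            "source, say 'I don't know' instead of guessing."
        )

        def ask(question: str) -> str:
            payload = {
                "model": MODEL,
                "system": SYSTEM,
                "prompt": question,
                "stream": False,
            }
            req = urllib.request.Request(
                OLLAMA_URL,
                data=json.dumps(payload).encode(),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read())["response"]

        if __name__ == "__main__":
            print(ask("How do I rotate journald logs weekly?"))

    The point isn't the script, it's the habit: if the answer comes back with no source, or a source you can't actually open, treat the whole suggestion as suspect.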