• hexaflexagonbear [he/him]@hexbear.net
    15 days ago

    I was making ChatGPT do some tedious thing, and I kept telling it “you got X wrong”. It kept going “oh, you’re right, I got X wrong, I will not do that again” and then giving the exact same output. lol, the one time ChatGPT gave me consistent outputs for the same prompt

    • Natanox@discuss.tchncs.de
      15 days ago

      Yeah, same with Codestral. You have to tell it what to do very specifically, and once it gets stuck somewhere you have to move to a new session to get rid of the history junk.

      Both it and ChatGPT also repeatedly told me to save binary data I wanted to store in memory as a list, with every 1024 bytes being a new entry… in the form of strings (supposedly). And the worst thing is that, given the way it extracted that data later on, this unholy implementation from hell would’ve probably even worked up to a certain point.
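
      For what it's worth, a minimal Python sketch of what that suggestion presumably looked like (the function names and the latin-1 choice are my own assumptions, not what the models actually emitted): binary data split into 1024-byte entries, each stored “as a string”. It only round-trips because a one-byte-per-character decoding happens to be lossless.

      ```python
      # Hypothetical reconstruction of the chunked-string storage idea.
      # latin-1 maps every byte 0-255 to exactly one character, so the
      # bytes -> str -> bytes round trip happens to work; any other
      # codec (or any text processing in between) would corrupt the data.

      def store_as_string_chunks(data: bytes, chunk_size: int = 1024) -> list[str]:
          # Split the blob into 1024-byte entries, each "saved as a string".
          return [data[i:i + chunk_size].decode("latin-1")
                  for i in range(0, len(data), chunk_size)]

      def reassemble(chunks: list[str]) -> bytes:
          # Re-encode each string entry and glue the bytes back together.
          return b"".join(chunk.encode("latin-1") for chunk in chunks)

      blob = bytes(range(256)) * 5          # 1280 bytes of sample binary data
      chunks = store_as_string_chunks(blob)
      assert len(chunks) == 2               # one full 1024-byte entry + remainder
      assert reassemble(chunks) == blob     # works... up to a certain point
      ```

      So yes, it “works”, right up until something treats those strings as actual text.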