• wischi@programming.dev
    2 days ago

    To give the (anthropomorphized) model credit, I think the biggest problem is that, the way it’s trained, it has practically no concept of the relation between the tokens it spits out and the HTML that the ChatGPT UI wrapper will produce from them.

    Because of that, inserting regular newlines often doesn’t work: a single newline in Markdown doesn’t translate to a line break in HTML.
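
    For what it’s worth, the rule that bites here can be sketched like this (a toy illustration of the CommonMark soft-break behaviour, not a real Markdown parser):

    ```python
    # Toy sketch of the rule: a single newline is a "soft break" and
    # collapses into a space; only a blank line starts a new paragraph.
    def naive_md_to_html(text: str) -> str:
        paragraphs = [p.strip().replace("\n", " ") for p in text.split("\n\n")]
        return "".join(f"<p>{p}</p>" for p in paragraphs if p)

    print(naive_md_to_html("line one\nline two"))    # <p>line one line two</p>
    print(naive_md_to_html("line one\n\nline two"))  # <p>line one</p><p>line two</p>
    ```

    So unless the model emits a blank line (or a trailing double space, which it rarely does reliably), the renderer just glues its lines together.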

    To force a particular structure I often ask it to use a well-known format like YAML. Because there is so much YAML training data, it’s practically impossible for the LLM not to add newlines in the correct places (and it’s also typically rendered correctly, because the model puts it in a Markdown code block).
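
    For example, asking for an answer shaped like this (the field names here are just made up for illustration) pretty much forces newlines wherever the format requires them, since the YAML wouldn’t be valid otherwise:

    ```yaml
    # Hypothetical structured answer requested from the model:
    steps:
      - name: install
        command: pip install example-package
      - name: run
        command: example --verbose
    notes: every list item and key has to start on its own line for the YAML to parse
    ```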

    • Evotech@lemmy.world
      2 days ago

      Yeah, I got it to work eventually. It’s just a fun issue: it had no problem telling me what’s wrong and how to fix it, it just couldn’t.