• Ech@lemmy.ca · 3 days ago

    assuming it’s not an AI hallucination.

    All output from an LLM is a “hallucination”. That’s the core function of the algorithm.
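    To illustrate the point: an autoregressive LLM produces every token the same way, by sampling from a predicted probability distribution over possible next tokens, and nothing in that mechanism distinguishes a "true" continuation from a "hallucinated" one. A minimal toy sketch of that sampling loop, assuming a made-up vocabulary and hand-picked probabilities in place of a real neural network:

    ```python
    import random

    # Toy "language model": for each context word, a made-up probability
    # distribution over possible next tokens. A real LLM computes these
    # probabilities with a neural network, but the generation step is the same.
    NEXT_TOKEN_PROBS = {
        "the":    [("cat", 0.5), ("moon", 0.3), ("report", 0.2)],
        "cat":    [("sat", 0.6), ("flew", 0.4)],
        "moon":   [("landing", 0.7), ("cheese", 0.3)],
        "report": [("says", 0.9), ("sings", 0.1)],
    }

    def generate(prompt: str, max_tokens: int = 3) -> str:
        """Sample a continuation token by token.

        Every token, plausible or nonsensical, comes out of the exact same
        sampling step; the model never checks anything against reality.
        """
        tokens = prompt.split()
        for _ in range(max_tokens):
            candidates = NEXT_TOKEN_PROBS.get(tokens[-1])
            if not candidates:
                break
            words, probs = zip(*candidates)
            tokens.append(random.choices(words, weights=probs, k=1)[0])
        return " ".join(tokens)

    if __name__ == "__main__":
        print(generate("the"))  # e.g. "the cat sat" or "the moon cheese"
    ```

    Whether the output happens to be accurate or absurd, the model did exactly the same thing to produce it.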

    • julietOscarEcho@sh.itjust.works · 2 days ago

      I was a computer scientist at a time when early generative AI work referred to output as the model “dreaming”. Makes it sound kind of sweet. It was viewed as kind of kooky to run pattern recognition models forward…