• Zacryon@feddit.org
    11 days ago

    I’d say: know your tools. People misusing “stuff” and being vulnerable to it is nothing new. Yet in a lot of cases we rely on people making independent, mature decisions, and it’s no different with LLMs. That said, meaningful (technological) safeguards should of course be implemented wherever possible.

    • Amberskin@europe.pub
      11 days ago

      By their very nature, there is no way to implement robust safeguards in an LLM. The technology is toxic, and the best that could happen is that something else is developed, hopefully not based on brute-forcing the production of a stream of tokens, which makes it obvious that LLMs are a false path, a road that should not be taken.