• Amberskin@europe.pub
    11 days ago

    By their very nature, there is no way to implement robust safeguards in an LLM. The technology is toxic, and the best that could happen is that something else (hopefully not based on brute-forcing the production of a stream of tokens) is developed, making it obvious that LLMs are a false path, a road that should not be taken.