• timberwolf1021@lemmy.blahaj.zone
    2 days ago

    What a shitty fluff article. You can only see the details once you subscribe to their shitty site.

    TL;DR: Guy asked ChatGPT what you can replace chloride with in salt, and it thought he meant chemistry, so it told him about other halides like bromide. It did NOT recommend that he eat it. He inferred that on his own, and then gave himself a psychiatric illness by eating buckets of it.

    This is an example of an unfortunate illness caused by human stupidity.

    Don’t blame human stupidity on LLMs.

  • nickwitha_k (he/him)@lemmy.sdf.org
    4 days ago

    Please. Please! Stop using LLMs as therapists. Talking to a poorly-drawn coconut with a face would be a healthier alternative. The reality is that, for those who do not have access to real therapy, receiving no therapy at all is objectively more likely to lead to positive outcomes than using an LLM.