• Frezik@lemmy.blahaj.zone
    24 days ago

    The patient who experienced the latter affront, a 31-year-old Los Angeles man whom Tech Review identified only by the first name Declan, said that he was in the midst of a virtual session with his therapist when, the connection becoming scratchy, the client suggested they both turn off their cameras and speak normally.

    Instead of broadcasting a normal blank screen, however, Declan’s therapist inadvertently shared his own — and “suddenly, I was watching [the therapist] use ChatGPT.”

    “He was taking what I was saying and putting it into ChatGPT,” the Angeleno told the magazine, “and then summarizing or cherry-picking answers.”

    There have got to be some HIPAA issues with that.

    • mrgoosmoos@lemmy.ca
      24 days ago

      yep

      this was worse than I thought it would be

      I could understand using it as a search tool. but straight-up transcribing the session and using its responses… that’s fucked up