• Cracks_InTheWalls@sh.itjust.works
    13 days ago

    I’d argue it has promise, but

    a) ChatGPT ain’t it; any LLM technology used for therapeutic purposes needs some serious fucking guardrails, both in terms of privacy AND in addressing the sycophancy and hallucination problems, and

    b) it really should only be one tool within a larger therapeutic program - think an interactive version of CBT worksheets, or a first-session intake form that MIGHT serve up some very basic, low-risk techniques to try before getting assigned to a flesh-and-blood therapist. Heck, one thing that popped to mind was maybe improving initial patient-therapist matches (if managed by a larger mental health organization/group of therapists), reducing the need to shop around, which is often a big barrier to starting effective treatment. Folks seem to open up a lot when using these tools, and a review of those transcripts in the intake process could be very useful for assigning patients.

    Using current consumer LLM tools as simulated therapists without oversight by actual mental health professionals is a fucking nightmare, no argument here. But at minimum, we’re seeing evidence that patients who otherwise eschew traditional therapy, whether for financial reasons or other factors, are using them. I think there’s something useful here if you can correct for the current risks and get the right people involved re: design and deployment within a larger therapeutic program.

    I can’t imagine someone somewhere isn’t doing some work with this in mind right now. How that would all pan out, idk.

    • brucethemoose@lemmy.world
      13 days ago

      Some in the enthusiast community (who have a good grasp of how LLMs work because we literally finetune them) have used “straight” finetuned LLMs as personal pseudo therapists. Two examples I can think of are the Samantha series (trained on psychologist/therapist transcripts and some other specialized data IIRC), and Dan’s Personality Engine.

      I do, sometimes.

      They aren’t therapists, but at the same time they’re 100% private, free, available, and “open” to bouncing ideas off of at any time. I’ve had some major breakthroughs that a lifetime of on/off therapy missed, like that I’m likely on the autism spectrum (which was previously diagnosed as just ADD). I discuss things I would never send to ChatGPT, post on a public forum like this (no offense), or even tell a therapist in some cases.

      I’m not saying this is great as-is for the general population. Again, awareness of “what” these models are and what their tendencies are runs extremely high among LLM tinkerers, but the latent potential is there.