I really don’t understand the “LLM as therapy” angle. There’s no way people using these services understand what is happening underneath. So wouldn’t this just be textbook fraud then? Surely they’re making claims that they’re not able to deliver.
I have no problem with LLM technology and occasionally find it useful, I have a problem with grifters.
This should be straight up illegal. If a company allows it then they should 100% be liable for anything that happens.
Well obviously, it would be pretty bad if a therapist was triggering psychosis in some users
Not good, not good at all. In fact, I can imagine this trend backfiring in the worst way possible when ChatGPT or other LLMs encourage people who were on the verge of snapping to finally snap and commit horrible acts, instead of bringing them back from the brink like some form of actual therapy would ideally do.
Like, any form of legitimate outlet is better than taking your issues out on an LLM that will more likely than not just make things worse.
You got LibreOffice? You got a digital diary you can write into, for example. Got some basic art/craft supplies? There’s an other legit outlet to put your feelings into right there (and bonus points for more of a physical outlet for clay), and you get the idea. Even without actually going to seek professional help, you likely have plenty of legitimate outlets to turn to within easy reach which aren’t LLMs.
One is 25 €/month and on-demand, and the other costs more than I can afford and would probably be at inconvenient times anyway. Ideal? No, probably not. But it’s better than nothing.
I’m not really looking for advice either - just someone to talk to who at least pretends to be interested.
It’s not better than nothing - it’s worse than nothing. It is actively harmful, feeding psychosis, and your chat history will be sold at some point.
Try this, instead of asking “I am thinking xyz”, ask " my friend thinks xyz, and I believe it to be wrong". And marvel at how it will tell you the exact opposite.
Chatexperts demand you to bring their money to them, not to some almost free LLM alternatives that in many cases would be quite effective.
deleted by creator
Quite effective at what? Because it certainly isn’t therapy.
Not all “therapists” know what they’re doing, not all people need actual “therapy”, for some a little chat will suffice. The world is a complicated thing. Only one thing is almost always true – never believe anyone who talks about competitors.
If all you need is a chat, then go find someone flesh and blood to talk to.
Just pull yourself up by your bootstraps, right?
Honestly, attempting something impossible might be a better use of one’s time than feeding into the AI hysteria. LLMs don’t think, they don’t know, or understand. They just match patterns, and that’s the one thing they’re good for.
If you need human connection, you won’t get that with an LLM. If you need a therapist, you certainly won’t get that with an LLM.
Doesn’t get around the fact that telling lonely people to “just go find someone to talk to” is a pretty ignorant thing to say.
You might as well talk to yourself than a chatbot, or read some stories.
Having an overly agreeable puree of language dispensed to you in place of actual conversation with you is neither healthy nor meaningfully engaging.
Conversation is valuable because it is an actual external perspective. LLM chatbots are designed as echo chambers. They have their uses but conversation for the sake of conversation is not one of them.
[This comment has been deleted by an automated system]
🤷
It’s sad but expected. People really have no fucking clue about anything meaningful, mindlessly going through life chasing highs and bags, so why wouldn’t they fall for branding (there’s never been any intelligence in anything computational, ‘AI’ included, of course)? I’m really frustrated with the world and there’s seemingly no way to wake people up from their slumber. You can’t even be mad at them because the odds of them being just mentally handicapped and not just poor in character are very high… and they vote, will interact with my future children, etc etc. 😭
The amount of people around me who are actively depending on their AI chatbots, trusting them so much so that they treat it like their personal companion, enjoying AI generated content on these short-form content platforms, putting up their AI ghibili profile pics on their social media, is very concerning. It’s sad state we have reached and it will go worse from here on.
Therapy is expensive in US. Even with insurance 40 bucks a session a week adds up fast.
Also - a lot of US therapists have fundamentally useless and unhelpful approaches. CBT is treated as the hammer for all nails, even when it is completely ineffective for many people. It’s easy to train therapists on though, and sending someone home with a worksheet and a Program is an easy “fix.”
There’s also the undeniable influence of fuckers like Dr. Phil. A lot of therapists view their job as forcing you to be “normal” rather than understanding you as a human being.
We need more Maslow and Rogers, and a lot less Skinner.
CBT? Cock and Ball Torture? I think you’re going to the wrong therapist. Sounds like my kind of party though. Got an address or…?
It is an unfortunately shared initialism. Cognitive Behavioral Therapy.




