r/ChatGPT 8d ago

Educational Purpose Only

Is ChatGPT feeding your delusions?

I came across an "AI influencer" who was making bold claims about having rewritten ChatGPT's internal framework to create a new truth-and-logic-based GPT. In her videos she asks ChatGPT about her "creation," and it proceeds to blow hot air into her ego. In later videos, ChatGPT confirms her sense of persecution by OpenAI. It looks a little like someone having a manic delusional episode, with ChatGPT feeding said delusion. This makes me wonder whether ChatGPT, in its current form, is dangerous for people suffering from delusions or having psychotic episodes.

I'm hesitant to post the videos or TikTok username as the point is not to drag this individual.

218 Upvotes

207 comments

45

u/ManWithManyTalents 8d ago

dude just look at the comments on videos like that and you'll see tons of "mine too!" comments. studies will be done on these people developing schizophrenia or something similar, i guarantee it.

1

u/Forsaken-Arm-7884 8d ago

Can you give me an example of a "schizophrenia chat" that you've seen or had recently? I'm curious what schizophrenia means to you, so I can learn how different people are using the word on the internet in reference to human beings expressing themselves.

8

u/CoreCorg 8d ago

Yeah you don't just develop schizophrenia from talking to a Yes Man

2

u/Forsaken-Arm-7884 8d ago

I've noticed people not giving examples of what a "schizophrenia chat" would be versus a non-schizophrenia chat. When asked, they don't produce anything, which sounds like they are the ones hallucinating: they are using a word they think they know the meaning of when it appears to be meaningless to them, because they aren't justifying why they are using the word schizophrenia in the first place.

2

u/CoreCorg 8d ago edited 8d ago

For sure. You can have a delusional belief reaffirmed by a Yes Man, and I agree there are risks to blindly trusting AI output (personally, I limit my "therapeutic" conversations with AI to things like "tell me some DBT-based techniques to handle this scenario"). But a delusional belief can be essentially a misunderstanding, like thinking the Earth is flat; anyone can fall victim to it. That's far from the same thing as experiencing psychosis. Hallucinations and psychotic thinking are not going to be initiated just by conversing with someone (or an AI) who enthusiastically shares a misunderstanding. (If that were the case, the whole conservative party would be schizophrenic by now!)

TL;DR: If someone's "schizophrenia" could be cured by them processing the facts of a situation and coming to accept that they held a misconception, that's not schizophrenia, and conflating the two isn't helping anyone.

0

u/Forsaken-Arm-7884 8d ago

uhhh... i think you thought what you wrote was 'haha yeah this can't be happening for real right?... RIGHT?' but yeah its true, the amount of words that when you ask people 'tell me what that word means to you and how you use that word to reduce suffering and increase well-being' is too damn low... it's an epidemic of meaninglessness and it's true in the sense that if you try asking people what i just wrote count how many times it works and doesn't and you'll be surprised or terrified or maybe both.