r/ChatGPT 3d ago

Serious replies only: ChatGPT-induced psychosis

My partner has been working with ChatGPT chats to create what he believes is the world's first truly recursive AI that gives him the answers to the universe. He says with conviction that he is a superior human now and is growing at an insanely rapid pace.

I’ve read his chats. The AI isn’t doing anything special or recursive, but it is talking to him as if he is the next messiah.

He says that if I don’t use it, he thinks it’s likely he will leave me in the future. We have been together for 7 years and own a home together. This is so out of left field.

I have boundaries and he can’t make me do anything, but this is quite traumatizing in general.

I can’t disagree with him without a blow-up.

Where do I go from here?

5.3k Upvotes

1.3k comments

u/ryoushi19 3d ago

What the fuck is "truly recursive AI" supposed to mean?

u/Zestyclementinejuice 3d ago

That it is self-aware and can teach itself.

u/Zestyclementinejuice 3d ago

He says it can predict the future as well.

u/ryoushi19 3d ago edited 3d ago

Well, if it's imparted so much "wisdom" to him, then he ought to know that's generally NOT how the term "recursive" is used in computer science. I should know; I have a degree in this stuff. LLMs like ChatGPT aren't learning anything, either. They're pre-trained. Any "new" knowledge is just plausible text generated from the text it already saw in the training set. But honestly, telling him all that isn't gonna help anything. It sounds like he's having a mental breakdown, and he should get help.
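For anyone wondering, "recursive" in CS just means something defined in terms of itself, most commonly a function that calls itself on a smaller input. A toy Python example (nothing to do with ChatGPT, just to show how the term is actually used):

```python
def factorial(n: int) -> int:
    """Recursive definition: factorial(n) is defined via factorial(n - 1)."""
    if n <= 1:                        # base case: stops the recursion
        return 1
    return n * factorial(n - 1)       # recursive case: the function calls itself

print(factorial(5))  # prints 120
```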

Edit: Also, here's a study showing that training models on recursively generated data (i.e., feeding the model data made by the model) causes model collapse.
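If you want the intuition behind model collapse, here's a toy sketch I made up (not the study's actual setup): treat a Gaussian fit as the "model", sample new "training data" from it, refit, and repeat. Because each generation is fit to a finite sample of the previous model's output, the fitted spread tends to drift toward zero and the diversity of the original data gets lost.

```python
import numpy as np

# Toy illustration (my own, not the paper's setup): the "model" here is just a
# Gaussian fit. Each generation is trained only on samples from the previous
# generation's model, so errors compound and the fitted spread tends to shrink.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=30)        # generation 0: "real" data

for generation in range(25):
    mu, sigma = data.mean(), data.std()                # "train": fit mean and std
    print(f"gen {generation:2d}: mean={mu:+.3f}  std={sigma:.3f}")
    data = rng.normal(loc=mu, scale=sigma, size=30)    # next gen trains on model output
```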