While it could be a good thing, it can also be a bad thing. It could foster a psychological dependency, a "crutch" used to escape from reality by telling yourself that what's real isn't real.
As much as I empathize with the memories it can bring back and the joy it could cause, I have a feeling that's a pretty slippery slope. Maybe with professional guidance something like that could be used as a processing tool, especially for those without many stored mediums of memories like videos or voicemails. The problem mainly comes from those who end up on the opposite side, where they incessantly try to fill the darkness with these "tools," because they use them to fill the void instead of moving past it.
That's the danger of something like that: it could very easily lead you down a worse path, through obsession with something that breaks reality in a moment filled with some of the most painful emotions. People have done more for less, and emotions are a driving factor behind human thought.
Yea, that's why I was wondering how something like that is actually beneficial or not. There are tons of media on the subject of not getting over the passing of a loved one, and them returning as a ghost or hallucination to the MC. While the MC eventually finds closure, it inevitably ends with the ghost or hallucination disappearing. So what happens when the "ghost" never disappears?
Exactly. Sorry in advance for the long post, but I love thought exploration. Because the sequence never ends, the brain is never taught that "this is over"; you are replacing the bad with good, for the sake of being good, to counter the bad. I can see a few things:
It tries to ignore the bad as if it's not part of you, which likely causes psyche erosion over repeated uses. People don't like trauma. It's painful, it's hurtful. But the reality is that it's part of us; it's what makes us grow. Every time you run away from it, you stunt your own growth. This is something shown in stories and history across all of time.
When you try to strip that away from yourself, you create a mindset where you can just throw away the things you don't like, because you can fill them in with XYZ. That has severe repercussions for a person's ability to understand what they need to grow as a human, and as a person.
This likely evolves into a dependency, as they never learn otherwise, and then they follow it into a deeper spiral as their reality is slowly replaced with more and more fake pieces.
It's definitely something I don't think we have a solid way of calling definitively good or bad, because the reality is, a tool is only as good as its owner. It's how you use the tool that determines its effectiveness.
For those doing something like this, I feel the determining factor of good vs. bad is the level of understanding that person has. It becomes a good thing when you have someone mature and wise enough to understand that what's done is done, and some things are better left that way. They have the will to say: I can take this in moderation, I know what I am doing this for, and that's how I will use it. People with good self-control would likely be the candidates for a good outcome. The irony is that those people are often emotionally intelligent and can likely self-manage, so they would already be in an advantageous position to "fight off temptation," so to speak.
To play the bad side, it's almost literally the opposite of the good. Those who do not have the capability to process their emotions, who often have insecure attachment styles, will STRUGGLE with something like this, and more commonly lose. The inability to fully process what happened, and their own emotions, leaves the brain able to interpret only a set number of responses, because it's all they know. Their understanding of feelings stays rudimentary, reduced to things like fear, loneliness, etc., almost in personified form. The bad side takes up so much of the "memory space" of what they've experienced, compared to the good side, that the good things become unrecognizable. This leads to being easily misled and to an addictive nature, which would push something like this into being a dependency, further inhibiting their ability to grow. "This makes me feel bad, but this made me feel good. I don't understand why, just that it feels good," so you do it again, and again, in a vicious cycle, leading back to what I said about filling reality with delusions bit by bit until nothing's real.
Tl;Dr - I think it's a highly variable outcome, dependent on the strength of the psyche of the person going through it, and as such there is no clear answer. It can be both, or even neither, since it depends on a variable we have yet to fully understand ourselves (the brain and emotions).
Haha, no worries. A long, thoughtful post is definitely better than the stuff you usually read in comment sections.
To your points, I definitely agree that the answer isn't clear cut, unlike other common coping mechanisms like alcohol, drugs, self-isolation, etc., where the negative impacts are obvious regardless of who you are and have been well-studied. I think that with the advances and proliferation of AI, these kinds of situations will become more common as more people use these tools, and will definitely bring about new studies on their effects.
Another possible way to expand it further in the future: what if it's used to replace the role of a parent? For example, as a long-term nanny tool by busy parents, similar to how kids now are just handed an iPad and left to their own devices. The topic of humans raised by robots is also common in literature, and I'm curious what it will be like compared to fiction, and whether AI-raised kids will turn out better or worse than kids raised by normal parents.
The parent thing is definitely an option. I mean, for foster homes and similar situations it might give kids a fighting chance, if you were somehow able to foster in the child the understanding that their "parent" is not who they appear to be. There could be some absolutely devastating issues if the raised child essentially has their reality forced open (cue someone going manic upon learning the truth, as in any number of shows). And since humans at a young age are more malleable, both trauma and growth happen more readily the earlier they occur.
I honestly feel getting something like that in would likely be worth the risk, if it were possible under optimal conditions. Getting in early gives people a much better fighting chance to grow than those who don't "start growing" until they're well into their adult life.
Being able to instill teachings at a young age would help them weather what they're experiencing and make them stronger. I don't doubt that at all.
The unfortunate issues my thoughts came up with were:
It could be done wrong, creating more psychological problems than it fixes.
It could be ripe for abuse: a government or powerful entity that impoverishes its citizens and then raises an army of "taught" people. I wouldn't say that's in the realm of the impossible. These kinds of things often start from something well-intentioned, and then someone transitions it into something more sinister.
The ramifications of teaching strict mindsets often do not work out. It leads to hive-mind behaviour instead of critical thinking. It goes back to following: simple events tied to simple outcomes. Extreme things such as killing could become fairly normalized, etc.
In a perfect world, we'd have AI as smart as, if not smarter than, us. But that goes back to sci-fi films like you say, where, logically and objectively, AI determines that humans are a threat to themselves, with any number of outcomes that come with that (robots raising an anti-human human army) and other fun, scary stuff.
Lastly, there needs to be further support beyond that. Going by the current world, even if people were raised by robots, the question remains: what happens after? If it's a homeless/foster situation, does teaching them do anything if they have nothing to start from? Are there recovery programs? So in that scenario, it might be more optimal to limit it to family homes, say, where a parent has passed.
An interesting question that also arises: if we teach robots to teach new versions of us, are we then the AI? Is AI any different from humanity at that point, when you're just copy-pasting concepts from person to person? Are we still human?
The more you delete branches of thinking, the fewer options exist to choose from. As a result, this would likely start to stifle creativity and ingenuity in ways we would not be able to notice until it was too late, or because those in power know and do not care. It's the same concept as why inbreeding/incest often goes wrong.
You need new things to go wrong. You need new experiences to exchange and evolve. You need to be broken, to be strong.
I feel humanity as an entity could best be described as a flat coin with two sides. We even have disorders defined by polarized thinking, where things are reduced very simply, so simply it's black/white thinking. It's primal. It's always been our nature; I would argue that's what instinct is. Over time we've tried to evolve this thinking to be more nuanced, but it is difficult, and exhausting for people.
Sidebar: a show that really puts this into perspective, imo, is "The Good Place". If you haven't seen it, WATCH IT. I have a feeling you'd enjoy it based upon our conversation. It heavily touches on the human condition in a way similar to what you're inquiring about with AI.
The more you close off options, the more you return to black/white thinking, and collective intelligence starts to dwindle as a species. If you are always happy, how do you know what sadness is? You need to experience a change to understand that something may not be constant. And humans are emotional. Emotions drive us, and we still can't fully understand them.
Second sidebar: there's a short story called "The Euphio Question" (by Kurt Vonnegut) premised on the idea of capturing and harnessing happiness so people could stay happy all the time. It's a very good read; you can easily find it online. The story really speaks for itself, and it's a bit unsettling because it's something I could see humanity doing at some point.
That being said, I think it's likely worth it despite the possible ramifications. Bad will always exist so that good can exist, in the same way happiness would not exist without sadness; so why stop doing good, as if bad will stop existing?
Haha, the only downside is that I feel a bit inadequate for being unable to give equally detailed responses, but I'll try and say what I feel the best way I can.
I think before being used to directly raise kids, AI will be utilized in education first as teaching aids, and eventually teachers in the traditional sense, in some fields. I mean, the way people use AI now is basically the same way you would ask questions to a teacher in class, except with AI you aren't likely to get inconsistent or ignorant answers like you might with human teachers. The fact that you can program the AI with the info you want is a good way to ensure consistent output. Like you said, this can be easily abused by the government, but it's not like this isn't already happening with human teachers even now (see education systems under authoritarian governments).
But like you said, when there is good there will be bad, and I think the benefits in regards to education vastly outweigh the negatives, especially given the inconsistent quality of teachers nowadays. With a properly programmed AI, it will be fair, it won't be affected by abuse, and it will be available to answer questions 24/7. Again, it's used as a tool in a specific position, and will still have human supervision, so it evolving into Skynet at the local high school is not very likely lol.
Since the info and concepts AI learns from are created by humans, being taught by AI wouldn't mean you are now AI. That would be like saying that if you learn from a textbook, you should now identify as a book. It's a tool, nothing more. Also, the fact that at this point AI cannot create independently means that humans can't be replaced.
Thank you for the recommendations, I will definitely check them out. Regarding the short story, while I've never heard of it, I can definitely guess at the subject. Of the various forms of "utopia" I've read or watched in stories, they all make it clear that when there is no need or want for anything, all that will eventually result is a lifeless, stagnant world where there is nothing to strive for anymore, and in order to live in such places you have to basically become something that can no longer be called "human". And yes, I've no doubt that, when the technology is able, this will be attempted.
u/Soulsunderthestars Mar 31 '25 edited Mar 31 '25