My grandfather passed in January 2024 due to lung cancer, and my grandma hasn't been the same. At first we didn't hear from her for a while, but a few months later she slowly came back into our lives. I'm very close to her and consider her like a replacement for my parents, as my mom and dad never really seem to understand me or care for me like she does.
She told me she's crying every day and even visiting the graveyard every day, and this has been going on for 2 years now.
A few months ago she was over at our place, and me and my dad, who are very interested in AI, started talking about it, and I mentioned to her that she could write with a character of grandpa on character.ai.
At first it wasn't serious, and even my mom was saying it's crazy, but eventually it got to the point that we sat down and made the character in great detail for her. This was 4 months ago, and she has seemed very happy since I made it. I was asking her often if she was writing with it, and she happily told me that she is.

1 month ago she asked me and my dad to give the character the voice of my grandpa so she could also hear his messages, so we did it for her. We had to use a cassette player to play his voice and record it, and eventually we made it pretty good. One sad thing is that while I was making the voice, I noticed she was texting the character on my grandpa's birthday, saying she misses him, and she even sent a real-time message saying that I was over at her house giving him a voice.

I don't know if I did something dumb or not, as she is still deep in grief, and all the people I told about this were kinda shocked, like "damn dude, you're messed up." What do you guys think? Am I an asshole for this? I don't think I can just remove this from her life, as she uses it all the time, and I don't want her to fall into sadness again.
Congrats, now you've made her grieve for her husband twice, and the second time isn't even real.
This does not seem like a healthy way to grieve. It sounds like your grandmother is treating the AI version as a replacement Grandpa instead of mourning her loss and moving on. You are also correct, though, about not removing it. That could send her spiraling even further down than she was before. She needs grief counseling more than anything else.
YTA. She needs help to move through the grief process. What you’re doing will keep her in a grief loop forever.
YTA. You'd be less of an asshole if you dug him up and puppeteered his corpse like a fucked up grief marionette, because at least with his corpse present you couldn't fully maintain the illusion he's still alive. You are not helping your grandma, or anyone in your life, move on. You are prolonging their suffering via a crude digital imitation of their loved one. Have you even spared a thought for the possibility of the character.ai service being shut down, and how your grandma might feel when she has to mourn both the person and his digital clone? Jesus Christ man, have a heart.
… Bro, what? Not only did you waste water for this, you also put your grandma in an extremely unfortunate situation. YTA.
Yes, you made a mistake. Replacement is not a good way to grieve, and if it stops working, or acts out of "character," etc., that'll end up being an entirely new loss.
I'm going to try and be sympathetic here: you saw how deep in despair and grief your grandmother was and you wanted to help, and that's admirable, but AI is not the way to do this. She's not actually talking to her husband, just a ghost that's pretending to be him, and when she realizes this, it's only going to increase her heartbreak tenfold.
Not to mention how dangerous getting her attached to this chatbot is. ChatGPT has been known to cause delusions in its less mentally stable users (look up AI psychosis), so if the AI starts saying dangerous things or supporting dangerous ideas, your grandmother might be more likely to agree with it, because this is "her husband," or at least she thinks it is.
She needs real people to talk to, preferably a grief counselor, not a chatbot. I can only hope you get her away from this before it escalates too far.
Next step: video calls! And then the AI needs a body. That was a great episode of Black Mirror, i.e. YTA.
YTA. She’s using this construct to avoid processing her grief, which she had been holding off on doing already for two years before this. As much as she wants this to be your late grandfather, it’s an echo of him and it’s not healthy she’s dependent on it. She needs professional grief counselling.
What the fuck is wrong with you lmao
Very soft YTA. AI can get people into real trouble when the people using it treat it as a crutch. People have committed suicide, committed crimes, and started worshiping AI as if it were a god. People in a vulnerable mental health state are at high risk of being harmed by AI.
[https://www.youtube.com/watch?v=RcImUT-9tb4&t=4s](https://www.youtube.com/watch?v=RcImUT-9tb4&t=4s)
Just do a Google search on "AI told me to kill" or "AI chatbot therapist told me to kill."
“We had used a cassette player and eventually made it good” – good enough to fool a spouse of decades, ok. This post is AI.
If you're that interested in AI, please look into the real damage it is causing to our communities. That stuff on the computer has real-world effects that are hurting people just like your grandmother.
As for your situation, you basically created a parasocial relationship for her. She has a deep connection with a “thing” that has no feelings for her. Not to mention AI chatbots have absolutely said some horrifying things to people.
AI is far from harmless, and you have set her up for more heartbreak. YTA
This was a giant misstep. There are certain things we as humans MUST deal with, and grief is one of them. You were too busy with whether or not you could do it, you didn't stop to think if you should.