this post was submitted on 13 Jun 2024

Technology

Company he works at: eternos.life

[–] frog@beehaw.org 11 points 5 months ago (1 children)

Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes.

An AI isn't going to magically know these things, because these aren't AIs based on brain scans preserving the person's entire mind and memories. They can only learn from the data they're given. And fortunately, there's a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: they could write them down, or record a video. No AI needed.

[–] godzilla_lives@beehaw.org 10 points 5 months ago (2 children)

We have a box of old recipe cards from my grandmother that my wife cherishes. My parents gifted them to her because, out of all their daughters-in-law, she is the one who loves to cook and explore recipes the most. I just can't imagine someone wanting something like that in a sterile, technological form like an "AI-powered" app.

"But Trev, what if you used an LLM to generate summaries-" no, fuck off (he said to the hypothetical techbro in his ear).

[–] frog@beehaw.org 8 points 5 months ago (3 children)

I also suspect, based on the accuracy of the AIs we have seen so far, that their interpretation of the deceased's personality would not be very accurate: they would likely hallucinate memories or facts about the person, or make them "say" things they never would have said when they were alive. At best it would be very Uncanny Valley, and at worst it would be very, very upsetting for the bereaved person.

[–] Zaktor@sopuli.xyz 3 points 5 months ago (1 children)

This is a very patronizing view of people who all seem to be well informed about what this is and isn't, and who have already acknowledged that they will put it aside if it scares them. No one is foisting this on the bereaved wife, and the husband has preemptively said it's ok if she or her children never use it.

This might fail in all the ways you think it will. That's a very small dataset of information, so it's likely either to be an overcomplicated recording or to need training beyond what he personally said, but it's not your place to tell her what's best for her personal grieving process.

[–] frog@beehaw.org 4 points 5 months ago* (last edited 5 months ago) (1 children)

Given that the husband is likely going to die in a few weeks, and the wife is likely already grieving for the man she is shortly going to lose, I think that still places both of them in the "vulnerable" category, and the owner of this technology approached them while they were in this vulnerable state. So yes, I have concerns, and the fact that the owner is allegedly a friend of the family (which just means they were the first vulnerable couple he had easy access to, in order to experiment on) doesn't change the fact that there are valid concerns about the exploitation of grief.

With the way AI techbros have been behaving so far, I'm not willing to give any of them the benefit of the doubt about claims of wanting to help rather than make money - such as using a vulnerable couple to experiment on while making a "proof of concept" that can be used to sell this to other vulnerable people.

[–] Zaktor@sopuli.xyz 1 points 5 months ago* (last edited 5 months ago) (1 children)

So just more patronizing. It's their life, you don't know better than them how to live it, grief or no.

[–] frog@beehaw.org 2 points 5 months ago

Nope, I'm just not giving the benefit of the doubt to the techbro who responded to a dying man's farewell posts online with "hey, come use my untested AI tool!"

[–] godzilla_lives@beehaw.org 2 points 5 months ago (1 children)

I have no doubts about that either, myself. Though even if such an abomination of a doppelganger were to exist, and it seems that these companies are hellbent on making it so, it would be worse for the reasons you described previously: prolonging and molesting the grieving process that human beings have evolved to go through. All in the name of a dollar. I apologize for being so bitter about this (this bitterness is not directed at you, frog), but this entire "AI' phenomenon fucking disgusts and repulses me so much I want to scream.

[–] frog@beehaw.org 2 points 5 months ago

I absolutely, 100% agree with you. Nothing I have seen about the development of AI so far has suggested anything other than that the vast majority of its uses are grotesque. The few edge cases where it is useful and helpful don't outweigh the massive harm it's doing.

[–] intensely_human@lemm.ee 1 points 5 months ago (1 children)

I think it would be the opposite of upsetting, but in an unhealthy way. I think it would snap them out of their grief into a place of strangeness, and they'd stop feeling their feelings.

There is no cell of my gut that likes this idea.

[–] frog@beehaw.org 1 points 5 months ago

Yeah, I think you could be right there, actually. My instinct on this from the start has been that it would prevent the grieving process from completing properly. There's a thing called the gestalt cycle of experience: a normal, natural mechanism for a person going through a new experience, whether good or bad. A lot of unhealthy behaviour patterns stem from a part of that cycle being interrupted. You need to go through the cycle for everything that happens in your life, reaching closure so that you're ready for the next experience to begin (that's the most basic explanation), and when that doesn't happen properly, it creates unhealthy patterns that influence everything that happens after that.

Now I suppose, theoretically, there's a possibility that being able to talk to an AI replication of a loved one might give someone a chance to say things they couldn't say before the person died, which could aid in gaining closure... but we already have methods for doing that, like talking to a photo of them or to their grave, or writing them a letter, etc. Because the AI still creates the sense of the person still being "there", it seems more likely to prevent closure - because that concrete ending is blurred.

Also, your username seems really fitting for this conversation. :)

[–] averyminya@beehaw.org 2 points 5 months ago (1 children)

I meant more the case of someone whose life was cut short and who didn't have the time to put something like this together. I agree that ideally this is information you'd get to pass down, but life doesn't always work out like that.

Also, like you said about the AI-powered app, it's only a matter of time before Adobe Historical Life comes out and we're paying $90 a month for gramma's recipes (stories are an additional subscription).

[–] intensely_human@lemm.ee 3 points 5 months ago

I went back and read old emails from my mother who died in 2009. I had unread emails from her.

One of them contained my grandmother’s peanut butter cookie recipe, which I thought was lost when she passed in 2003.

It might have been nice if an LLM had found that instead of me, but it felt amazing to discover it myself.