this post was submitted on 13 Jun 2024
100 points (100.0% liked)

Technology


Company he works at: eternos.life

top 50 comments
[–] thingsiplay@beehaw.org 88 points 5 months ago

So it hurts long after his death.

[–] jlow@beehaw.org 86 points 5 months ago (4 children)

https://en.m.wikipedia.org/wiki/Be_Right_Back

Black Mirror is not an instruction manual, people. Quite the opposite. Can we stop trying to make every episode real?

[–] FaceDeer@fedia.io 37 points 5 months ago* (last edited 5 months ago)

If you don't want to do it then don't do it. Can we stop trying to tell everyone else they have to have the same values as you?

[–] stick2urgunz88@lemm.ee 6 points 5 months ago

This was my first thought. How about we not try to recreate the dystopian TV show?

[–] intensely_human@lemm.ee 4 points 5 months ago

We’re not “trying to make every episode real”. Technology’s direction and human foibles are predictable. Black Mirror writers just aren’t blind and have a good sense of what’s coming down the pipeline.

That’s why it’s called Mirror. It’s about showing us who we are.

Sorry if that’s too horrifying for you, but this goes way beyond imitating the last person to mention these problems.

[–] Kolanaki@yiffit.net 4 points 5 months ago* (last edited 5 months ago)

Maybe they were inspired by Mulholland Drive instead.

[–] naevaTheRat@lemmy.dbzer0.com 67 points 5 months ago (2 children)

My wife is fortunately still alive, so maybe that colours my view. However, when I've lost other people, the blessed anaesthesia of forgetting has been essential to being able to function.

From the short quote it seems like she maybe has a healthy-ish attitude but idk... I feel like this would be a shallow simulacrum that prolongs grief.

[–] henfredemars@infosec.pub 41 points 5 months ago* (last edited 5 months ago) (6 children)

I don’t believe humans are meant to manage loss in this way — stretching out an imitation of our loved one. As painful as it is, I personally believe humans need to say goodbye. I feel this gets in the way of feeling and truly accepting the loss so that a person can move forward.

Loss is truly heavy, but I do not believe this is better or healthy.

[–] Mycatiskai@lemmy.ca 23 points 5 months ago

My sister has hundreds of YouTube videos she used to help her students learn between music lessons. It will soon be two years since she died, and I haven't been able to watch even one.

I like to remember her in my mind; it hurts less than seeing her as she was when she was alive.

[–] naevaTheRat@lemmy.dbzer0.com 11 points 5 months ago

Yeah. I am not a Buddhist, but I've always found that something rings true in the reflections on impermanence. When we bond with someone we accept the pain of loss, and when we feel it, most people seem to describe relief once they're able to "let go" and accept it being over.

It seems to me that encouraging clinging and reminiscing stunts you a bit and only provides temporary relief from the loss while drawing out the time it takes to process it.

Idk though, maybe I'll have the misfortune to feel differently some day. It's hard to judge someone hanging out with their spouse watching death creep closer each day. I have approximately zero idea what my opinions would be in the face of that.

[–] thingsiplay@beehaw.org 10 points 5 months ago (1 children)

People who can't get over losing someone will sorrow for the rest of their life, or until they get over it. And AI won't help them get over it. Death is part of our life, and as soon as you don't accept it, it becomes pain.

I think it was last year when I read that someone recreated a mother's lost son (or some other family member, I forget) in a VR environment, so she could see him again in VR. Absolute madness! What does this do to the person? Now couple that with an AI... man, the future is grim...

[–] henfredemars@infosec.pub 8 points 5 months ago

I had this conversation with my wife once. I let her know that it is my advance wish that she allow me to complete the cycle of life. Anything else, any reconstruction of me that technology allows, is to me an abomination. Keep the pictures, keep the memories, but don't keep me here when I am gone.

I refrain from judging the decisions of others where possible, but this is my personal wish.

[–] scrubbles@poptalk.scrubbles.tech 8 points 5 months ago

I tried things like Character AI to play with talking to "celebrities". It was novel, it was fun. For about 15 minutes. Then... eh. It's not the person, and your brain knows it's not them. It's always an imitation. I got bored talking with people I've always wanted to talk to.

I can't imagine it being a loved one who has passed. It would feel hollow and empty, and it wouldn't make the pain leave. Idk, it just wouldn't be good at all.

[–] FaceDeer@fedia.io 5 points 5 months ago (3 children)

I don't believe humans are "meant" to do anything. We are a result of evolution, not intentional design. So I believe humans should do whatever they personally want to do in a situation like this.

If you have a loved one who does this and you don't feel comfortable interacting with their AI version, then don't interact with their AI version. That's on you. But don't belittle them for having preferences different from your own. Different people want different things and deal with death in different ways.

[–] frog@beehaw.org 5 points 5 months ago (1 children)

There may not have been any intentional design, but humans are still meant to eat food, drink water, and breathe oxygen, and going against that won't lead to a good end.

[–] FaceDeer@fedia.io 4 points 5 months ago (1 children)

Even with that, being absolutist about this sort of thing is wrong. People undergoing surgery have spent time on heart/lung machines that breathe for them. People sometimes fast for good reasons, or get IV fluids or nutrients provided to them. You don't see protestors outside of hospitals decrying how humans aren't meant to be kept alive with such things, though, at least not in most cases (as always there are exceptions, the Terri Schiavo case for example).

If I want to create an AI substitute for myself it is not anyone's right to tell me I can't because they don't think I was meant to do that.

[–] frog@beehaw.org 4 points 5 months ago (2 children)

Sure, you should be free to make one. But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!), there are valid questions about whether that will cause them harm rather than help - and grieving people do not always make the most rational decisions. They can very easily be convinced that interacting with AI-you would be good for them, but it actually prolongs their grief and makes them feel worse. Grieving people are vulnerable, and I don't think AI companies should be free to prey on the vulnerable, which is a very, very realistic outcome of this technology. Because that is what companies do.

So I think you need to ask yourself not whether you should have the right to make an AI version of yourself for those who survive your death... but whether you're comfortable with the very likely outcome that an abusive company will use their memories of you to exploit their grief and prolong their suffering. Do you want to do that to people you care about?

[–] Zaktor@sopuli.xyz 3 points 5 months ago* (last edited 5 months ago) (1 children)

This is speculation of corporate action completely divorced from the specifics of this technology and particulars of this story. The result of this could be a simple purchase either of hardware or software to be used as chosen by the person owning it. And the person commissioning it can specify exactly who such a simulacrum is presented to. None of this has to be under the power of the company that builds the simulacrums, and if it is structured that way, then that's the problem that should be rejected or disallowed, not that this particular form of memento exists.

[–] intensely_human@lemm.ee 2 points 5 months ago (1 children)

It could still be a bad idea even if the profit motive isn’t involved.

One might be trying to help with the big surprise stash of heroin they leave to their widow, and she might embrace it fully, but that doesn’t make it a good idea or good for her.

[–] henfredemars@infosec.pub 4 points 5 months ago* (last edited 5 months ago) (1 children)

Meant, in this context, refers to the conditions that humans have faced over a long period of time and may be better suited to coping with, from a survival point of view. I'm an atheist, so I find it strange that you chose to read my comment as implying intentional design. Certainly, AI has existed for a much shorter time than the phenomenon of a human encountering the death of a loved one. Indeed, death has been a common theme throughout history, and the tools and support available for coping with it and relating it to other human experiences far exceed those for coping with the potential issues that come with AI.

I think one can absolutely speak of needs and adaptation for a human experience as common as death. If you find something belittling about that opinion, I'm not sure how to address you further. I may simply have to be wrong.

[–] frog@beehaw.org 4 points 5 months ago

Just gonna say that I agree with you on this. Humans have evolved over millions of years to respond emotionally to their environment. There's certainly evidence that many of the mental health problems we see today, particularly at the scale we see them, are in part due to the fact that we evolved to live in a very different way from our present lifestyles. And that's not about living in cities rather than caves, but more to do with the amount of work we do each day, the availability and accessibility of essential resources, the sense of community and connectedness within small social groups, and so on.

We know that death has been a constant of our existence for as long as life has existed, so it logically follows that dealing with death and grief is something we've evolved to do. Namely, we evolved to grieve for a member of our "tribe", and then move on. We can't let go immediately, because we need to be able to maintain relationships across brief separations, but holding on forever to a relationship that can never be continued would make any creature unable to focus on the needs of the present and future.

AI simulacrums of the deceased give the illusion of maintaining the relationship. It is entirely possible that this will artificially prolong the grieving process, when the natural cycle of grieving eventually reaches a point of acceptance. I don't know for sure that's what would happen... but I would want to be absolutely sure it's not going to cause harm before unleashing this AI on the general public, particularly on vulnerable people (which grieving people are).

Although I say that about all AI, so maybe I'm biased by the ridiculous ideology that new technologies should be tested and regulated before vulnerable people are experimented on.

[–] intensely_human@lemm.ee 3 points 5 months ago

Yes. Nothing about this idea sounds like a good idea. Honestly I’m kind of pissed at the dude for saddling his wife with this gift.

[–] I_am_10_squirrels@beehaw.org 4 points 5 months ago (1 children)

One of my colleagues has something along the lines of superior autobiographical recall. He remembers in great detail major and minor events from childhood to today. It's difficult for him to forget.

I myself have forgotten long stretches of my life, and even looking at pictures of myself from those times it feels unfamiliar.

There are some things I wish I could remember better, but overall I prefer my forgetful brain to his never-forgetting one.

[–] intensely_human@lemm.ee 2 points 5 months ago

I've got that autobiographical recall, and it's kind of weird being able to remember times with my friends that they can't remember.

It just feels lonely. Imagine being the only person who can remember more than an hour ago, and how different your life would feel from the lives of those living within that one-hour window.

It’s like that just with a different scale.

[–] OsrsNeedsF2P@lemmy.ml 40 points 5 months ago (1 children)

He posted online, telling his friends it was time to say goodbye. Then his friend called him up, saying he had an opportunity at his company Eternos.Life for Bommer to build an interactive AI version of himself.

It doesn't get more tech bro than that

[–] Zaktor@sopuli.xyz 21 points 5 months ago (2 children)

But in this case it seems like an entirely good thing? The offer was made by an actual friend, the guy himself wanted this, his wife too, and they're both pretty cognizant about what this is and isn't.

[–] averyminya@beehaw.org 6 points 5 months ago (3 children)

Yeah, contrary to all the negativity about this in this thread, I think there are a lot of worthwhile reasons for this that aren't centered on fawning over the loss of a loved one. Think of how many family recipes could be preserved. Think of the stories that could be retold to you in 10 years. Think of the little things you'd easily forget as time passes. These are all ways of keeping someone with us without making their death the main focus.

Yes, death and moving on are a part of life, we also always say to keep people alive in our hearts. I think there are plenty of ways to keep people around us alive without having them present, I don't think an AI version of someone is inherently keeping your spirit from continuing on, nor is it inherently keeping your loved one from living in the moment.

Also, I can't help but think of the Star Trek computer with this. When I was young I had a close gaming friend whom we lost too soon; he was very much an announcer personality. He would have been perfect as my voice assistant, and he would have thought it was hilarious.

Anyway, I definitely see plenty of downsides, don't get me wrong. The potential for someone to wallow with this is high. I also think there are quite a few upsides, as mentioned: these memories aren't ephemeral, and I think it's somewhat fair to pick and choose good memories to pass down to remember. Quite a few old philosophical ideas are coming to fruition with tech these days.

[–] frog@beehaw.org 11 points 5 months ago (1 children)

Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes.

An AI isn't going to magically know these things, because these aren't AIs based on brain scans preserving the person's entire mind and memories. They can learn only from the data they're given. And fortunately, there's a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: write them down, or record a video. No AI needed.

[–] godzilla_lives@beehaw.org 10 points 5 months ago (2 children)

We have a box of old recipe cards from my grandmother that my wife cherishes. My parents gifted them to her because, of all their daughters-in-law, she is the one who most loves to cook and explore recipes. I just can't imagine someone wanting something like that in a sterile technological form like an "AI-powered" app.

"But Trev, what if you used an LLM to generate summaries-" no, fuck off (he said to the hypothetical techbro in his ear).

[–] frog@beehaw.org 8 points 5 months ago (5 children)

I also suspect, based on the accuracy of AIs we have seen so far, that their interpretation of the deceased's personality would not be very accurate, and would likely hallucinate memories or facts about the person, or make them "say" things they never would have said when they were alive. At best it would be very Uncanny Valley, and at worst would be very, very upsetting for the bereaved person.

[–] Zaktor@sopuli.xyz 3 points 5 months ago (1 children)

This is a very patronizing view of people who all seem to be well informed about what this is and isn't and who have already acknowledged that they will put it aside if it scares them. No one is foisting this on the bereaved wife and the husband has preemptively said it's ok if her or her children never use it.

This might fail in all the ways you think it will. That's a very small dataset of information, so it's likely either to be an overcomplicated recording or to need to incorporate training beyond what he personally said, but it's not your place to tell her what's best for her personal grieving process.

[–] frog@beehaw.org 4 points 5 months ago* (last edited 5 months ago) (2 children)

Given the husband is likely going to die in a few weeks, and the wife is likely already grieving for the man she is shortly going to lose, I think that still places both of them into the "vulnerable" category, and the owner of this technology approached them while they were in this vulnerable state. So yes, I have concerns, and the fact that the owner is allegedly a friend of the family (which just means they were the first vulnerable couple he had easy access to, in order to experiment on) doesn't change the fact that there are valid concerns about the exploitation of grief.

With the way AI techbros have been behaving so far, I'm not willing to give any of them the benefit of the doubt about claims of wanting to help rather than make money - such as using a vulnerable couple to experiment on while making a "proof of concept" that can be used to sell this to other vulnerable people.

[–] averyminya@beehaw.org 2 points 5 months ago (1 children)

I more meant in the case of someone whose life was cut short and didn't have the time to put something like this together. I agree that ideally this is information you'd get to pass down, but life doesn't always work out like that.

Also like you said about the AI powered app, it's only a matter of time before Adobe Historical Life comes out and we're paying $90 a month for gramma's recipes (stories are an additional subscription).

[–] intensely_human@lemm.ee 3 points 5 months ago

I went back and read old emails from my mother who died in 2009. I had unread emails from her.

One of them contained my grandmother’s peanut butter cookie recipe, which I thought was lost when she passed in 2003.

It might have been nice if an LLM had found that instead of me, but it felt very amazing to discover it myself.

[–] intensely_human@lemm.ee 4 points 5 months ago

Think of how many family recipes could be preserved

We solved this problem long before we invented writing.

LLMs do not enable the keeping of family memories. That’s been going on a long time.

[–] Zaktor@sopuli.xyz 2 points 5 months ago

This is a weirdly "you should only do things the natural way" comment section for a Tech-based community.

Humans also weren't "meant" to be on social media, or recording videos of themselves, or even building shrines or gravesites for their loved ones. They're just practices that have sprung up as technology and culture change. This very well could be an impediment to her moving on without him, but that's her choice to make, and all this appeal to tradition is patronizing and doesn't actually mean tradition is the right path for any given individual. The only right way to process death is:

  • Burn their body and possessions so that no trace remains
  • Pump their body full of chemicals so they won't be decomposing when people ceremonially visit their corpse weeks later
  • Entomb them with their cats, slaves, and riches
  • Plant a tree nourished by their decomposing corpse
  • Turn their ashes into a piece of jewelry to be carried with you always
  • Make a shrine to the dead in your home to be prayed at regularly
  • Cast a death mask to more accurately sculpt their bust
  • Freeze their head so they may be resurrected later
[–] padlock4995@lemmy.ml 12 points 5 months ago

Transcendence and Black Mirror both ended really well. Keep at it. T200's next.

[–] Dendr0@fedia.io 11 points 5 months ago

The only news I care to hear about people wealthy enough to throw away someone else's yearly salary on trends like this is if and when they get punted square in the nuts.

So far, it's been slow news on that front.

[–] Marin_Rider@aussie.zone 10 points 5 months ago (1 children)

literally the plot of Caprica

[–] Cethin@lemmy.zip 7 points 5 months ago* (last edited 5 months ago) (1 children)

Sadly, no one knows the plot of Caprica because we're the only two people in the world who watched it. It's impressive how well BSG was received and is remembered, yet most people don't even know Caprica exists.

[–] Marin_Rider@aussie.zone 6 points 5 months ago

It is a shame. I wish they had been able to make another season, but I guess just for the two of us it wouldn't have made much sense!

[–] JoMiran@lemmy.ml 8 points 5 months ago

It made me think of this old Michael Keaton movie, "My Life", in which he leaves a treasure trove of video tapes to his unborn child.

[–] Psych@lemmy.sdf.org 6 points 5 months ago (1 children)

Guy's going full-on Pantheon.

[–] dubyakay@lemmy.ca 2 points 5 months ago

Ugh the brain "scan" though. I think that's bs. At least in the show, not sure about the short story.

[–] shasta@lemm.ee 4 points 5 months ago

Yeah that seems healthy

[–] autotldr@lemmings.world 2 points 5 months ago

🤖 I'm a bot that provides automatic summaries for articles:

"And my wife said, 'Hey, one of the things I will miss most is being able to come to you, ask you a question, and you will sit there and calmly explain the world to me,'" he said.

Then his friend called him up, saying he had an opportunity at his company Eternos.Life for Bommer to build an interactive AI version of himself.

You're reading the Consider This newsletter, which unpacks one major news story each day.

AI has access to all sorts of knowledge, but his wife only wants to ask it questions that only Bommer would know the answers to.

Normally, uploading this information would take weeks or months, but Bommer needed to put it together in just a few days.

But when thinking about what questions she might end up asking this tool, once Bommer dies: "I assume perhaps to read me a poem.


Saved 72% of original text.
