this post was submitted on 22 Jun 2023
21 points (95.7% liked)

Deleted (lemmy.dbzer0.com)
submitted 1 year ago* (last edited 1 year ago) by IsThisLemmyOpen@lemmy.dbzer0.com to c/asklemmy@lemmy.ml
 


you are viewing a single comment's thread
wols@lemmy.ml · 1 point · 1 year ago

Well, if you actually have free will, how can the machine predict your actions?

What if someone opened box B and showed you what was in it? What would that mean? What would you do?

kthxbye_reddit@feddit.de · 1 point · 1 year ago

I meant: let's imagine the machine predicted B and is wrong (because I take A+B). I would call that scenario "I have free will - no determinism." Then I will have 1.000.000.000 "only". That's a good result.
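The payoff logic being argued over is Newcomb's problem, which can be sketched as a small function. The dollar amounts below are the standard textbook values (1,000 in the visible box A; 1,000,000 in box B if and only if the machine predicted "B only") and are an assumption, since the thread itself never states them:

```python
def payoff(choice: str, prediction: str) -> int:
    """Player's payoff given their choice and the machine's prediction.

    'one_box' means taking box B only; 'two_box' means taking A+B.
    Box B contains 1,000,000 only if the machine predicted one-boxing.
    """
    box_a = 1_000
    box_b = 1_000_000 if prediction == "one_box" else 0
    return box_b if choice == "one_box" else box_a + box_b

# The scenario in the comment above: the machine predicted "B only",
# but the player takes both boxes anyway, so the machine was wrong.
print(payoff("two_box", "one_box"))  # both boxes pay out: 1,001,000
print(payoff("two_box", "two_box"))  # machine was right, B is empty: 1,000
print(payoff("one_box", "one_box"))  # machine was right, B pays: 1,000,000
```

Under these assumed amounts, a wrong prediction in the player's favor yields 1,001,000, which is the "good result" the commenter is pointing at.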