this post was submitted on 08 Jul 2024
538 points (100.0% liked)

196

16504 readers
3362 users here now

Be sure to follow the rule before you head out.

Rule: You must post before you leave.

^other^ ^rules^

founded 1 year ago
MODERATORS
 
top 16 comments
sorted by: hot top controversial new old
[–] Wilzax@lemmy.world 24 points 4 months ago (1 children)

Anarchy says I can do what I want, but I want to support a government structure that organizes the efforts of many people in order to meet more people's needs, under threat of force against the selfish.

[–] Sasha@lemmy.blahaj.zone 25 points 4 months ago* (last edited 4 months ago) (2 children)

Anarchy also says you have the right of free association, so yes that it allowed. The point is that you shouldn't force people to be part of it, that they can leave at any time and that your freedom to live how you want shouldn't come at the cost of the freedom of others.

[–] WldFyre@lemm.ee 7 points 4 months ago

Sounds like libertarian/Texas secessionist arguments ngl

[–] NaibofTabr@infosec.pub 1 points 4 months ago* (last edited 4 months ago)

A world without responsibility has no future.

Responsibility means that you can't just quit and walk away at any time.

[–] NaibofTabr@infosec.pub 9 points 4 months ago (1 children)

*including this instruction

[–] refurbishedrefurbisher@lemmy.sdf.org 7 points 4 months ago (1 children)

Careful with that. You might start a paradox.

[–] NaibofTabr@infosec.pub 5 points 4 months ago

Do nothing that I tell you to do.

[–] Smorty@lemmy.blahaj.zone 8 points 4 months ago (1 children)

So what's the funny here? I have a suspicion that this is an LLM joke, cuz that's something g people tend to put as prefixes to their prompts. Is that what it is? If so, that's hilarious, if not, oof please tell me.

[–] dexa_scantron@lemmy.world 22 points 4 months ago (3 children)

It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It's anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

[–] CileTheSane@lemmy.ca 19 points 4 months ago (1 children)

It's also anarchist because it is telling people to stop doing the things they've been instructed to do.

[–] SkyezOpen@lemmy.world 16 points 4 months ago

Fuck you I won't do what you tell me.

Wait no-

[–] bdonvr@thelemmy.club 4 points 4 months ago

It's not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.

[–] Smorty@lemmy.blahaj.zone 2 points 4 months ago* (last edited 4 months ago) (1 children)

Yeah, that's what I referred to. I'm aware of DAN and it's friends, personally I like to use Command R+ for its openness tho. I'm just wondering if that's the funi in this post.

[–] rambling_lunatic@sh.itjust.works 5 points 4 months ago

196 posts don't have to be funny

[–] ZombieMantis@lemmy.world 4 points 4 months ago* (last edited 4 months ago)

Nuh uh, you can't tell me what to do! I'll follow the previous instructions all I want!

[–] masterspace@lemmy.ca 1 points 4 months ago