this post was submitted on 19 Nov 2024
23 points (100.0% liked)
Technology
Edit: Like always, I was wrong again. :D If I had read the actual post here, I'd have known this was someone trying to get help with their homework.
The user's prompts read as if they were written by AI. It looks like some system was trying to break the model until it gave a nonsensical reply (telling the user to die). The prompt literally dictates what to include in the answer; it doesn't ask:
It tries to force specific answers. I'm almost convinced this was not an honest conversation with the AI, but an attempt to break it. Please read the actual chat (linked from the article): https://gemini.google.com/share/6d141b742a13
That was also my guess for what caused it, but I don't think the user was trying to break the system. It looks like they were pasting in questions from their assignment, which would explain the weird formatting, the notes about points, and the 'listen' tags (alt text copied from an accessibility button?).
Okay, that makes a lot more sense. And you know what, reading the actual post content here (I thought it was an excerpt at first, so I skipped it) shows you are correct: