this post was submitted on 09 Oct 2023
411 points (96.0% liked)

Technology


AI Industry Struggles to Curb Misuse as Users Exploit Generative AI for Chaos: Artificial intelligence just can't keep up with the human desire to see boobs and 9/11 memes, no matter how strong the guardrails are.

[–] Hamartiogonic@sopuli.xyz 10 points 1 year ago (1 children)

This is part of a bigger issue people need to be aware of: as more and more AI is deployed in public spaces and on the internet, people will find creative ways to exploit it.

There will always be ways to make an AI do things its owners don’t want it to. You could think of it like the exploits used in speedrunning, except with far more variety. Just as you can make an AI generate morally questionable material, you could potentially find a way to exploit the AI of a self-driving car to do whatever you can think of.

[–] kromem@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (1 children)

This is trivially fixable; it just costs 2-3x more per query, so it isn't deemed worth it for high-volume chatbots given the low impact of jailbreaking.

For anything where jailbreaking would somehow be a safety concern, that cost just needs to be factored in.
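The kind of fix being alluded to is usually a second-pass check: every response gets re-screened before it goes out, which is roughly why the per-query cost doubles. A minimal Python sketch, where `generate`, `moderate`, and the keyword blocklist are all stand-ins for real model calls:

```python
# Hypothetical two-pass guardrail: a second check on every output,
# roughly doubling the per-query cost.
BLOCKLIST = {"napalm"}  # stand-in for a real safety classifier


def generate(prompt: str) -> str:
    # stand-in for the actual model call
    return f"response to: {prompt}"


def moderate(text: str) -> bool:
    # stand-in for a second model pass that scores the draft output;
    # a real system would call a moderation model here, not match keywords
    return not any(term in text.lower() for term in BLOCKLIST)


def guarded_query(prompt: str) -> str:
    draft = generate(prompt)   # first pass: normal generation
    if moderate(draft):        # second pass: safety check (the extra cost)
        return draft
    return "[refused]"
```

The point of the structure is that a jailbreak now has to fool two independent passes, not one.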

[–] Hamartiogonic@sopuli.xyz 1 points 1 year ago

That’s true for anything that can carry a query cost. But what about AI applications that have no financial cost to the user? For instance, The Spiffing Brit keeps finding interesting ways to exploit the YouTube algorithm. I’m sure you can apply that same “hacker mentality” to anything with AI in it.

At the moment, many of those applications live on the web, and that’s exactly where a query cost can feasibly limit the number of experiments you can reasonably run in order to find your favorite exploit. If it’s too expensive, you probably won’t find anything worth exploiting, and that should keep the system relatively safe. However, more and more AI is finding its way into the real world, which means those exploits are going to have some very spicy rewards.
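The way a query cost throttles experimentation can be sketched as a token-bucket budget: a burst of probing drains the bucket quickly, and sustained probing is capped by the refill rate. All names and numbers here are illustrative, not from any real system:

```python
import time


class QueryBudget:
    """Hypothetical per-user budget. Each query debits tokens, so
    brute-force exploit hunting is either slow or expensive."""

    def __init__(self, capacity: float, refill_per_sec: float):
        self.capacity = capacity          # caps burst experiments
        self.tokens = capacity
        self.refill_per_sec = refill_per_sec  # caps sustained rate
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # refill proportionally to elapsed time, never above capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False
```

With a budget like this, an attacker gets only a handful of free probes before every further attempt costs real time or money.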

Just imagine if traffic lights were controlled by an AI and you found an exploit that got you a green light on demand. Applications like this don’t have any API query costs; you just need to be patient and try all sorts of weird stuff to see how the lights react. Sure, you can’t run a gazillion experiments in an hour, so you personally might not find anything worth exploiting. But with millions of people experimenting with the system simultaneously, surely someone would find an exploit.