this post was submitted on 25 Sep 2023
536 points (96.2% liked)

Technology

Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he "absolutely" believes that Amazon will soon start charging a subscription fee for Alexa

[–] Hazdaz@lemmy.world 0 points 1 year ago* (last edited 1 year ago) (1 children)

Well, that's exactly what I was thinking when these companies were making these claims... like HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? It didn't make sense.

EDIT: this entire website can go fuck off. You ask a simple question about some reasonably new tech, and you get downvoted for having the audacity to learn some new stuff. People on here are downright pathetic.

[–] ours@lemmy.film 4 points 1 year ago

"AI" doesn't use databases per se; these are trained models built from large amounts of training data.

Some models run fine on small devices (like the models that run on phones to improve photos), but others, like OpenAI's LLMs, are huge.
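To illustrate the distinction ours@lemmy.film is drawing, here is a minimal, purely illustrative sketch in Python: a trained model is just a frozen set of learned parameters, and inference is arithmetic over those parameters, not a lookup in a database that needs constant updating. All names and numbers below are hypothetical.

```python
# A toy "trained model": these weights were learned once during training
# and are now frozen. Answering a query is pure arithmetic over them --
# no database is queried or updated at inference time.
WEIGHTS = [0.8, -0.3, 0.5]   # hypothetical learned parameters
BIAS = 0.1

def predict(features):
    """Score an input as a weighted sum of its features (a linear model)."""
    score = BIAS
    for w, x in zip(WEIGHTS, features):
        score += w * x
    return score

# The model answers from its weights alone.
print(predict([1.0, 2.0, 3.0]))  # 0.8 - 0.6 + 1.5 + 0.1 = 1.8
```

A real LLM works the same way in principle, just with billions of parameters instead of three, which is why the large ones don't fit comfortably on a phone while small models do.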