This post was submitted on 25 Sep 2023
536 points (96.2% liked)


Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec

In an interview with Bloomberg, Dave Limp said that he "absolutely" believes that Amazon will soon start charging a subscription fee for Alexa.

TechAnon@lemm.ee 1 points 1 year ago

You just typed that question on one. See GPT4All: you can download many models and run them locally. They were about 5–16 GB in size the last time I downloaded one. Pretty slow if you don't have a hefty GPU, but it works!
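For anyone wondering what "run them locally" looks like in practice, here's a minimal sketch using GPT4All's Python bindings. The model file name is just an example (any model from the GPT4All downloader works), and the prompt is made up:

```python
# Minimal sketch of local inference with the gpt4all Python package.
# Assumption: the model name below is one example of a downloadable .gguf model;
# substitute whichever model you actually fetch.
from gpt4all import GPT4All

# Downloads the model file on first use, then loads it from the local cache.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

# Generation runs entirely on your own machine -- no cloud API involved.
reply = model.generate("Why might Amazon charge a subscription for Alexa?", max_tokens=200)
print(reply)
```

On CPU this can take a while per response; a decent GPU speeds it up a lot, but either way nothing leaves your machine.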