Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec
(www.businessinsider.com)
I don't understand this. Haven't Intel or Nvidia (or someone else) been claiming that their next CPUs will have AI functionality built in?
Having "AI functionality" doesn't mean they can just get rid of their big/expensive models they use now.
If they are anything like Open AI's LLM, it requires very beefy machines with a ton of expensive RAM.
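For a sense of scale, here's a rough back-of-envelope sketch (assuming fp16 weights, i.e. 2 bytes per parameter, and a GPT-3-class 175B-parameter model; the numbers are illustrative, not any vendor's spec):

```python
def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to hold a model's weights (fp16 = 2 bytes/param)."""
    return num_params * bytes_per_param / 1e9

# A small on-device model vs. a GPT-3-class LLM (175 billion parameters).
print(f"100M-param on-device model: {weight_memory_gb(100e6):.1f} GB")  # ~0.2 GB
print(f"175B-param LLM:             {weight_memory_gb(175e9):.1f} GB")  # ~350.0 GB
```

And that's just holding the weights, before you account for activations or serving more than one request at a time.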
Well, that's exactly what I was thinking when these companies were making those claims... like, HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? It didn't make sense.
EDIT: this entire website can go fuck off. You ask a simple question about some reasonably new tech, and you get downvoted for having the audacity to learn some new stuff. People on here are downright pathetic.
"AI" doesn't use databases per se, they are trained models built from large amounts of training data.
Some models run fine on small devices (like the model running on phones to make better pictures) but others are huge like Open AI's LLM.
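To make that concrete, here's a toy sketch (made-up weights, not any real product's code): a "model" is just a frozen set of numbers, and answering a query is arithmetic over those numbers, with no database lookup anywhere.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 2))  # stand-in for weights learned during training
bias = rng.standard_normal(2)

def predict(features: np.ndarray) -> np.ndarray:
    """Inference is pure math over the frozen weights -- no data store involved."""
    return features @ weights + bias

print(predict(np.array([1.0, 0.5, -0.3, 2.0])))
```

The "knowledge" lives in the values of the weights, which is why updating a model means retraining it rather than editing records somewhere.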
You can record and edit videos on your own devices, but that doesn't mean it's suddenly free for Netflix or YouTube to stream their videos to you.
Surely a local version of Alexa could be developed, but that development would come with its own costs.
Some things simply can't be done locally, such as web search, and route calculations for map applications are often done in the cloud too.