It all depends on the GPU. If it's something integrated into the CPU, it probably won't do much better; if it's a $2000 dedicated GPU with 48 GB of VRAM, it will be very powerful for neural net computing. NPUs are most often implemented as small, low-power, embedded solutions. Their goal isn't to compete with data centers or workstations, it's to enable some basic "AI" features on portable devices, e.g. a "smart" camera with object recognition that gives you alerts.
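
To make the VRAM point concrete, here's a minimal sketch (assuming PyTorch; the 40 GB cutoff is just an illustrative number, not tied to any particular model) of checking whether a dedicated GPU with enough memory is present before loading a big network, falling back to CPU otherwise:

```python
import torch

MIN_VRAM_GB = 40  # hypothetical cutoff for "the model fits in VRAM"

def pick_device() -> torch.device:
    # Prefer a dedicated GPU with plenty of VRAM; otherwise run on CPU.
    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)
        vram_gb = props.total_memory / 1024**3
        print(f"Found GPU: {props.name} with {vram_gb:.1f} GB VRAM")
        if vram_gb >= MIN_VRAM_GB:
            return torch.device("cuda")
        print("Not enough VRAM for the full model; falling back to CPU")
    return torch.device("cpu")

device = pick_device()
# model = MyModel().to(device)  # load the network onto whichever device was picked
```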