this post was submitted on 09 Jun 2024
75 points (95.2% liked)

PC Gaming


PowerColor has come up with an interesting use for neural processing units (NPUs) in modern CPUs. At Computex 2024, it displayed so-called "Edge AI" technology that pairs a graphics card with an NPU to lower power consumption in games.

The technology works by linking an AMD graphics card to the CPU's neural processing unit through PowerColor's software ("PowerColor GUI"), and the claimed efficiency gains are substantial. The manufacturer says Edge AI lowered power consumption in Cyberpunk 2077 from 263 W to 205 W, a 22% reduction; in Final Fantasy XV, the reported saving was also impressive at 18%.

And it is not just about energy efficiency, which hardcore PC gamers often dismiss as irrelevant when frame rates must be pushed to the very limit. Visitors to the booth also reported that PowerColor's NPU-assisted software increased frame rates by roughly 10% compared with a "dumb" system running without a neural processing unit.
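The Cyberpunk figure is easy to verify from the numbers quoted above; a quick sanity check in Python (the wattages are the article's, the script is just arithmetic):

```python
# Sanity check of the claimed savings using the article's figures
# for Cyberpunk 2077 (263 W without Edge AI, 205 W with it).
baseline_w = 263
edge_ai_w = 205

reduction = (baseline_w - edge_ai_w) / baseline_w
print(f"Power reduction: {reduction:.1%}")  # -> 22.1%, matching the claimed 22%
```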

top 9 comments
[–] lemmylommy@lemmy.world 22 points 5 months ago (1 children)

The thing works by linking an AMD graphics card with a neural processing unit via "PowerColor GUI," resulting in rather impressive efficiency gains.

So, how does it actually work? "Linking" is too vague to explain anything.

The only thing I can imagine is some sort of upscaling from a lower resolution, which is hardly revolutionary.

[–] tristan@aussie.zone 9 points 5 months ago (1 children)

My guess is it's similar to Intel XeSS, since that's pretty much what XeSS does: runs the game at a lower resolution and uses the NPU to upscale it in real time.

https://game.intel.com/us/xess-enabled-games/

The biggest difference this might bring is IF it can work with any game, rather than just specific ones.

So like FSR 1 but with AI upscaling. Does sound somewhat exciting.
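If that guess is right, the pipeline would look something like the sketch below. This is purely illustrative: PowerColor hasn't documented the mechanism, and `render_frame` / `npu_upscale` are hypothetical stand-ins.

```python
import numpy as np

# Hypothetical render loop: the GPU renders at a reduced internal
# resolution, and an NPU-hosted model upscales to native resolution.
NATIVE = (2160, 3840)   # 4K output
SCALE = 0.5             # render at half resolution per axis

def render_frame(height: int, width: int) -> np.ndarray:
    # Stand-in for the GPU renderer: produce an RGB frame.
    return np.random.rand(height, width, 3).astype(np.float32)

def npu_upscale(frame: np.ndarray, target: tuple[int, int]) -> np.ndarray:
    # Stand-in for the NPU inference step. Here: nearest-neighbour
    # repeat; a real implementation would run a learned model.
    factor = target[0] // frame.shape[0]
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

low_res = (int(NATIVE[0] * SCALE), int(NATIVE[1] * SCALE))
frame = render_frame(*low_res)          # cheaper GPU work -> lower power draw
output = npu_upscale(frame, NATIVE)     # upscaling offloaded to the NPU
assert output.shape[:2] == NATIVE
```

The power saving would then come from the GPU rendering a quarter of the pixels, with the comparatively frugal NPU filling in the rest.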

[–] mrfriki@lemmy.world 11 points 5 months ago (1 children)

If this holds true, then this is the first actually useful application of AI I've ever seen.

[–] KairuByte@lemmy.dbzer0.com 7 points 5 months ago (1 children)

Only if you’ve not been paying attention... There have been AI models for identifying objects in images/video for use in home automation/security for quite a while, just to name one. Or AI models that “learn” habits to curb power usage (though admittedly most implementations of that are dogshit).

There are plenty of legitimately useful applications for AI models; shoehorning an LLM into everything and anything is just the most visible, because it’s what every tech company and their brother is doing.

[–] ashok36@lemmy.world 1 points 5 months ago

Yeah, I've been using a Google coral to identify people, cars, animals, etc in my security camera feeds for years now.

[–] mindbleach@sh.itjust.works 3 points 5 months ago

In other words, interpolation. Guesswork. Hallucinated data, statistically correct-ish.

And they're offering this as some tertiary hardware add-on when both Nvidia and AMD have their own "AI" frame-fudging nonsense.
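To make the "interpolation" point concrete, here is a toy bilinear 2x upscaler: the missing pixels are invented from their neighbours. Learned upscalers (XeSS, DLSS, or presumably Edge AI) make a smarter statistical guess, but it is still a guess.

```python
import numpy as np

def bilinear_upscale_2x(img: np.ndarray) -> np.ndarray:
    """Double an RGB image's resolution by interpolating between neighbours."""
    h, w = img.shape[:2]
    ys = (np.arange(2 * h) / 2).clip(0, h - 1)   # source y coordinate per output row
    xs = (np.arange(2 * w) / 2).clip(0, w - 1)   # source x coordinate per output column
    y0, x0 = ys.astype(int), xs.astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]                # vertical blend weights
    wx = (xs - x0)[None, :, None]                # horizontal blend weights
    top = (1 - wx) * img[y0][:, x0] + wx * img[y0][:, x1]
    bottom = (1 - wx) * img[y1][:, x0] + wx * img[y1][:, x1]
    return (1 - wy) * top + wy * bottom

frame = np.random.rand(4, 4, 3)                  # stand-in for a rendered frame
upscaled = bilinear_upscale_2x(frame)            # (8, 8, 3): 3/4 of the pixels are guessed
```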

[–] Yokozuna@lemmy.world 3 points 5 months ago

Can't wait to have to buy an NPU as well as my CPU and GPU.

[–] Lojcs@lemm.ee 2 points 5 months ago

That website g r o w s