this post was submitted on 17 Jul 2024
688 points (99.0% liked)

[–] ZILtoid1991@lemmy.world 9 points 4 months ago

A big letdown for me is that, except in some rare cases, those extra AI features are useless outside of AI. Some NPUs are straight-up DSPs that could easily run OpenCL code; others either can't handle normal floating-point numbers at all, only the reduced-precision formats designed for machine learning; or they're CPU extensions that are just even bigger vector multipliers for select datatypes (AMX).
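
For the DSP-like case, a minimal sketch of what "could easily run OpenCL" would look like, assuming pyopencl is installed and the vendor actually ships an OpenCL driver that exposes the NPU (most don't, which is exactly the letdown):

```python
# Enumerate OpenCL platforms/devices; a DSP-style NPU with a proper driver
# would typically show up as CL_DEVICE_TYPE_ACCELERATOR.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print(f"{platform.name}: {device.name} ({kind})")
```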

[–] mlg@lemmy.world 8 points 4 months ago

Even DLSS only works great for some types of games.

Although there have been some clever uses of it, lots of games would gain more from a properly optimized game engine.

War Thunder runs like total crap even on the highest-end hardware, yet World of Warships has much more detailed ships and textures and runs fine off an HDD on graphics cards older than the GTX 7xx series.

Meanwhile on Linux, Compiz still runs crazy window effects and the 3D cube desktop much better and faster than KDE. It's so good I even recommend it for old devices with any kind of GPU, because the hardware acceleration will make your desktop fast and responsive compared to even the lightest window managers like Openbox.

TF2 went from 32-bit to 64-bit and saw immediate performance gains upwards of 50%, almost entirely eliminating the game's stuttering issues.

Batman: Arkham Knight ran on a heavily modified version of Unreal Engine 3, which was insane for the time.

Most modern games and applications really don't need the latest and greatest hardware; they just need to be efficiently programmed, which is sometimes almost an art in itself. Slapping on "AI" to reduce the work is sort of a lazy solution that will have side effects, because you're effectively predicting the output.

[–] RememberTheApollo_@lemmy.world 7 points 4 months ago

When a decent GPU alone is ~$1k and then someone wants you to pay more for a feature that offers no tangible benefit, why the hell would anyone want it? I haven't bought a prebuilt PC in over 25 years; I build my own, and for family and friends. I'm building another next week for family, and AI isn't even on the radar. If anything, this one is going to be anti-AI: it's getting a Linux dual-boot as well as sticking with Win10, because there's no way I'm subjecting family to that Win11 adware.

[–] JokeDeity@lemm.ee 7 points 4 months ago (1 children)

The other 26% were bots answering.

[–] Buelldozer@lemmy.today 6 points 4 months ago* (last edited 4 months ago) (1 children)

I'm fine with NPUs / TPUs (AI-enhancing hardware) being included in systems, because they're useful for more than just OS shenanigans and commercial generative AI. Do I want Microsoft Copilot Recall running on that hardware? No.

However, I've bought TPUs for things like Frigate servers and various ML projects. For gamers, there are some really cool use cases out there for using local LLMs to generate NPC responses in RPGs (see the sketch after this comment). For "smart home" enthusiasts, things like Home Assistant will be rolling out support for local LLMs later this year to make voice commands more context-aware.

So do I want that hardware in there so I can use it MYSELF for other things? Yes, yes I do. You probably will eventually too.
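
A minimal sketch of that local-LLM-for-NPCs idea, assuming an Ollama server is running locally with a model already pulled; the model name, persona, and prompt are placeholders, not anything from a shipping game:

```python
# Ask a locally hosted model for an in-character NPC reply via Ollama's
# HTTP API (default port 11434), with streaming disabled for simplicity.
import requests

def npc_reply(player_line: str) -> str:
    prompt = (
        "You are a gruff blacksmith in a fantasy RPG. "
        f"The player says: '{player_line}'. Reply in one short sentence."
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(npc_reply("Can you repair my sword?"))
```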

[–] Codilingus@sh.itjust.works 3 points 4 months ago

I wish someone would make software that utilizes things like an M.2 Coral TPU to enhance gameplay with frame gen, or upscaling for games and videos (see the sketch below). Some GPUs are even starting to put M.2 slots on the GPU itself, in case the latency from a motherboard M.2 slot to the GPU over PCIe would be too high.

The other 16% either don't know what AI is or are trying to sell it. A combination of both is possible. And likely.
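
A minimal sketch of what driving an M.2 Coral from software would look like, assuming pycoral is installed; the upscaler model file is hypothetical (Coral only runs small quantized TFLite models, so nothing DLSS-class actually exists for it today):

```python
# Run a hypothetical Edge TPU-compiled TFLite upscaler on one dummy frame.
import numpy as np
from pycoral.adapters import common
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter("upscaler_edgetpu.tflite")  # placeholder model
interpreter.allocate_tensors()

# Build a dummy frame matching the model's input shape (batch, h, w, c).
_, height, width, channels = interpreter.get_input_details()[0]["shape"]
frame = np.zeros((height, width, channels), dtype=np.uint8)

common.set_input(interpreter, frame)  # copy the frame into the input tensor
interpreter.invoke()                  # inference runs on the Edge TPU
upscaled = common.output_tensor(interpreter, 0)
print(upscaled.shape)
```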

[–] meathorse@lemmy.world 6 points 4 months ago

Bro, just add it to the pile of rubbish over there next to the 3D movies and curved TVs

[–] Xenny@lemmy.world 5 points 4 months ago (1 children)

As with any proprietary hardware on a GPU, it all comes down to third-party software support, and classically, if the market isn't there, it's not supported.

[–] Appoxo@lemmy.dbzer0.com 3 points 4 months ago

Assuming there's no catch-on after 3-4 cycles, I'd say the tech is either not mature enough, too expensive for too little result, or (as you said) there's generally no interest in it.

Maybe it needs a bit of maturing and a re-introduction at a later point.

[–] BlackLaZoR@kbin.run 4 points 4 months ago (1 children)

Unless you're doing music or graphic design, there's no use case. And if you are, you probably have a high-end GPU anyway.

[–] DarkThoughts@fedia.io 3 points 4 months ago (2 children)

I could see a use for local text gen, but that apparently takes quite a bit more than what desktop PCs can offer if you want actually good results and speed. Generally though, I'd rather have separate expansion cards for this. Making it part of other processors is just going to increase their price, even for those who have no use for it.
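
A back-of-the-envelope sketch of why local text gen "takes quite a bit": weight memory alone is roughly parameter count times bytes per weight, before the KV cache and runtime overhead (the model size and precisions below are illustrative assumptions):

```python
# Estimate the memory needed just to hold a model's weights.
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"7B model @ {bits}-bit: ~{weight_memory_gb(7, bits):.1f} GB")
# ~14 GB at fp16, ~7 GB at int8, ~3.5 GB at 4-bit -- hence the appeal of
# aggressive quantization on consumer cards.
```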

[–] blazeknave@lemmy.world 3 points 4 months ago

Not even on my phone
