A big letdown for me is that, with rare exceptions, these extra AI features are useless outside of AI. Some NPUs are straight-up DSPs that could easily run OpenCL code; others are designed so they can't handle normal floating point numbers, only formats made for machine learning; and some are just CPU extensions that add even bigger vector multipliers for select datatypes (AMX).
Even DLSS only works great for some types of games.
Although there have been some clever uses of it, lots of games could gain far more from a properly efficient game engine.
War Thunder runs like total crap even on the highest-end hardware, yet World of Warships has much more detailed ships and textures and runs fine off an HDD on pre-GTX 7XX graphics.
Meanwhile on Linux, Compiz still runs crazy window effects and the 3D cube desktop much better and faster than KDE. It's so good I even recommend it for old devices with any kind of GPU, because the hardware acceleration will make your desktop fast and responsive compared to even the lightest window managers like Openbox.
TF2 went from 32-bit to 64-bit and saw immediate performance gains upwards of 50%, almost entirely removing the game's stuttering issues.
Batman Arkham Knight ran on a heavily modified version of Unreal 3 which was insane for the time.
Most modern games and applications really don't need the latest and greatest hardware, they just need to be efficiently programmed, which is sometimes almost an art in itself. Slapping on "AI" to reduce the work is sort of a lazy solution that will have side effects, because you're effectively predicting the output.
When a decent gpu is ~$1k alone, then someone wants you to pay more $ for a feature that offers no tangible benefit, why the hell would they want it? I haven’t bought a PC for over 25 years, I build my own and for family and friends. I’m building another next week for family, and AI isn’t even on the radar. If anything, this one is going to be anti-AI and get a Linux dual-boot as well as sticking with Win10, no way am I subjecting family to that Win11 adware.
I'm fine with NPUs / TPUs (AI-enhancing hardware) being included with systems because it's useful for more than just OS shenanigans and commercial generative AI. Do I want Microsoft CoPilot Recall running on that hardware? No.
However, I've bought TPUs for things like Frigate servers and various ML projects. For gamers there are some really cool use cases out there for using local LLMs to generate NPC responses in RPGs. For "Smart Home" enthusiasts, things like Home Assistant will be rolling out support for local LLMs later this year to make voice commands more context-aware.
So do I want that hardware in there so I can use it MYSELF for other things? Yes, yes I do. You probably will eventually too.
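The local-LLM NPC idea above boils down to assembling the character's persona and the player's line into a prompt, then handing it to a locally hosted model (e.g. a llama.cpp server). A minimal sketch, where the function name and prompt format are made up for illustration and not any game's actual API:

```python
def build_npc_prompt(npc_name: str, persona: str, player_line: str) -> str:
    """Assemble a role-play prompt for a locally hosted LLM.

    In a real game you'd send this string to your local model server
    (llama.cpp, Ollama, etc.) and stream the reply back as dialogue.
    """
    return (
        f"You are {npc_name}, {persona}. "
        "Stay in character and reply in one or two sentences.\n"
        f"Player: {player_line}\n"
        f"{npc_name}:"
    )

# Example: a (hypothetical) blacksmith NPC reacting to the player.
prompt = build_npc_prompt(
    "Greta the blacksmith",
    "a gruff dwarven smith who distrusts outsiders",
    "Can you repair my sword?",
)
```

The point of keeping it this simple is that all the heavy lifting happens on the NPU/TPU/GPU running the model; the game itself only does string plumbing.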
I wish someone would make software that utilizes things like an M.2 Coral TPU to enhance gameplay, like frame gen or upscaling for games and videos. Some GPUs are even starting to put M.2 slots on the GPU itself, in case the latency from a motherboard M.2 slot to the PCIe GPU would be too slow.
The other 16% do not know what AI is or try to sell it. A combination of both is possible. And likely.
Bro, just add it to the pile of rubbish over there next to the 3D movies and curved TVs
As with any proprietary hardware on a GPU, it all comes down to third-party software support, and classically, if the market isn't there then it's not supported.
Assuming there's no catch-on after 3-4 cycles, I'd say the tech is either not mature enough, too expensive for too little result, or (as you said) there's generally no interest in it.
Maybe it needs a bit of maturing and a re-introduction at a later point.
Unless you're doing music or graphic design there's no use case. And if you do, you probably have a high-end GPU anyway.
I could see a use for local text gen, but that apparently eats quite a bit more than what desktop PCs can offer if you want actually good results and speed. Generally though, I'd rather have separate expansion cards for this. Making it part of other processors is just going to increase their price, even for those who have no use for it.
Not even on my phone