this post was submitted on 01 Oct 2024
112 points (100.0% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

top 20 comments
[–] sunzu2@thebrainbin.org 22 points 1 month ago (2 children)

Well they do need to make a profit per unit to justify selling them lol

Gaining market share while taking an L is not really a viable strategy in hardware markets, except maybe for consoles or data-mining "smart" devices.

[–] maniii@lemmy.world 5 points 1 month ago (1 children)

I wonder how Microsoft feels about Xbox and other things. Gaining market share by throwing money at it is an absolutely viable strategy for companies that can afford to do it.

AMD doesn't need to price-cut to compete here. If AMD could make OpenCL and similar projects more successful than CUDA and Nvidia, which requires time, money, and investment in people and talent, then AMD might have a viable alternative to compete with in 2 or 3 years.

[–] Auli@lemmy.ca 1 points 1 month ago (1 children)

Except it hasn't worked out too well for Microsoft.

[–] maniii@lemmy.world 1 points 1 month ago

At the expense of employee benefits, shareholder value/payouts, public interest, and competition; everyone and everything, except the M$ board & C-suite, lost money.

[–] orcrist@lemm.ee 1 points 1 month ago

Tell that to Amazon. If you can build a monopoly, you win in the medium run.

[–] schizo@forum.uncomfortable.business 16 points 1 month ago (5 children)

It's not just price, at least for me.

It's also the fact that FSR is worse than DLSS, that AMF is worse than NVENC, that their ray tracing performance is not even close, that AFMF isn't as good as DLSS frame generation, that the drivers aren't as stable, and so on and so on....

The whole product is just... not strictly equivalent, and the price difference isn't the reason that I don't really look too hard at AMD cards.

If AMD gets to equivalency with FSR, AMF, and AFMF, that'd make their cards FAR more compelling than a $100 lower price tag would.

[–] 30p87@feddit.org 21 points 1 month ago

FSR is still pretty good, especially considering it works on almost any GPU, which DLSS very much does not.

Also, Linux.

[–] Dudewitbow@lemmy.zip 10 points 1 month ago* (last edited 1 month ago)

The problem on the Nvidia front is that VRAM capacities are squeezing the midrange GPUs to the point that they may actually lose said features. In particular, with the 4070 Ti and slower, VRAM usage gets to the point where the user may not be able to use all the features at once and has to enable them selectively, because each feature has its own VRAM cost attached to it.

Outside of the 4060 Ti 16 GB, you have to spend $800 on the 4070 Ti Super to get 16 GB of VRAM.

[–] grandma@sh.itjust.works 9 points 1 month ago

I bought an AMD card even though NVIDIA's upscaling is much better. With the added raw performance for the same price, I'm not going to need to rely as much on upscaling. It starts making less and less sense the higher your budget goes, though.

[–] vithigar@lemmy.ca 7 points 1 month ago* (last edited 1 month ago) (2 children)

But consider that if you get a more powerful card at the same price, you don't need as much upscaling or frame generation. FSR being slightly worse is irrelevant if you can run the game at native.
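(For concreteness, here's a minimal sketch of the arithmetic behind that trade-off. The per-axis scale factors are the commonly cited FSR 2 / DLSS preset values, used here as assumptions for illustration.)

```python
# Internal render resolution under common upscaler presets. The per-axis
# scale factors below match the commonly published FSR 2 / DLSS presets,
# but treat them as assumptions for illustration.

MODES = {
    "Native":            1.0,
    "Quality":           1.5,   # ~44% of native pixels
    "Balanced":          1.7,   # ~35% of native pixels
    "Performance":       2.0,   # 25% of native pixels
    "Ultra Performance": 3.0,   # ~11% of native pixels
}

def render_load(width: int, height: int) -> None:
    native = width * height
    for mode, scale in MODES.items():
        w, h = round(width / scale), round(height / scale)
        print(f"{mode:>17}: {w}x{h} ({w * h / native:6.1%} of native pixels)")

render_load(3840, 2160)  # 4K target as an example
```

Quality mode renders well under half the native pixels, so a card with more raw throughput at the same price lets you run one preset closer to native at the same frame rate.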

AMF being worse than NVENC is certainly true, but in my opinion that barely matters. If you care about quality you should use CPU encoding no matter which card you have, and if you just want to capture video locally you can crank up the bitrate to the point where the differences become negligible.

As for ray tracing, there's no counterargument there. Nvidia is better; AMD doesn't match them. If you want to do anything with heavy ray tracing, AMD is basically a non-starter, though I do think it's adequate for games with light ray tracing.

[–] schizo@forum.uncomfortable.business 1 points 1 month ago (1 children)

> But consider that if you get a more powerful card at the same price, you don't need as much upscaling or frame generation. FSR being slightly worse is irrelevant if you can run the game at native.

I'm on a 3080, and if I'm getting 40 fps in a title at settings I'm happy with (which is ending up more common than I'd like), not even a 7900 XTX is going to give me the 90 fps I'd much prefer. And, lest you think I'm being vastly unfair, I'll also say there are no Nvidia cards that will do so either. And yes, this is entirely dependent on your resolution, but the ultrawide I'm quite fond of is essentially the same pixel count as 4K at 144 Hz, which is a lot of pixels to attempt to draw at once.
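(Back-of-the-envelope numbers for that pixel-count claim; the comment doesn't name the monitor, so the ultrawide resolutions below are illustrative assumptions.)

```python
# Pixel counts for a few common ultrawides versus 4K. The commenter
# doesn't name the monitor, so these resolutions are assumptions.

FOUR_K = 3840 * 2160  # 8,294,400 pixels

ULTRAWIDES = {
    "5120x1440 (32:9)":  5120 * 1440,
    "3840x1600 (24:10)": 3840 * 1600,
    "5120x2160 (5K2K)":  5120 * 2160,
}

for name, pixels in ULTRAWIDES.items():
    print(f"{name}: {pixels:,} px ({pixels / FOUR_K:.0%} of 4K)")

# And the refresh rate multiplies it: 5120x1440 at 144 Hz is over a
# billion shaded pixels per second.
print(f"5120x1440 @ 144 Hz: {5120 * 1440 * 144:,} px/s")
```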

The only way to get there (at least until the 5090 shows up, I guess?) is to do some sort of upscaling. And, frankly, FSR is, subjectively, not 'slightly worse' but such an artifact-y mess (at least in the games I'm playing) that I'd rather have 40 fps than deal with how ugly it makes everything.

XeSS is a lot better, and works fine on AMD cards, but until FSR gets a lot cleaner, or everything starts supporting XeSS, DLSS is still the best game in town.

As for NVENC, you're absolutely right, unless you're using it for streaming and have a hard cap on upper bitrates because you're not Twitch royalty. I'll admit that's an edge case most people don't have, or even need to consider, but if you do need low-bitrate streaming and don't want x264 doing it in software, it's NVENC or sub-par quality from AMF. I'm honestly surprised they haven't invested time in fixing the one real use case that hardware encoding still has (real-time encoding at low bitrates), but I suppose someone somewhere has an Excel sheet showing that the market that cares about it is too small to be worth the time.

[–] vithigar@lemmy.ca 1 points 1 month ago

If there are no Nvidia cards that can run your game at 90 fps, not even the 4090, then you're using ray tracing, I assume? In which case I've already agreed. The gap is too large, and a product-tier offset in AMD pricing isn't going to make up for it. My comments about FSR vs DLSS in this scenario assume a superior performance baseline for AMD, where you're comparing no FSR to DLSS "quality", or maybe FSR "quality" to DLSS "performance". AMD would need to tank their prices to an absurd degree to close that gap when ray tracing is involved.

As for why AMD hasn't put more time into their encoder, I suspect they were banking on people moving away from AVC to HEVC or, more recently, AV1. Their HEVC and AV1 encoders are much closer in quality to Nvidia's than their AVC encoder is, and have clearly had more attention paid to them. Hell, even as far back as Polaris cards, AMD's HEVC encoder was faster than their AVC one, while also looking better.

[–] Auli@lemmy.ca 1 points 1 month ago (1 children)

I don't know, AMD's encoding is pretty bad.

[–] vithigar@lemmy.ca 1 points 1 month ago

Their AVC encoding is pretty bad, yes. Simple solution there if you have an AMD card: Don't use it.

If you're streaming, use x264; it'll look better than either AMD or Nvidia hardware encoding at streaming bitrates. If you're recording locally, use HEVC or AV1, which AMD does much better with than AVC.
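(A minimal sketch of that split, driving ffmpeg from Python. It assumes an ffmpeg build with libx264 and AMD's AMF encoders; encoder names like hevc_amf vary by platform, and on Linux AMD hardware encoding usually goes through VAAPI, so check `ffmpeg -encoders` on your build. Paths, URLs, and bitrates are placeholders.)

```python
# Driving ffmpeg from Python for the two cases the comment describes.
import subprocess

def stream_x264(src: str, rtmp_url: str, kbps: int = 6000) -> None:
    """Streaming: CPU-encode with x264 under a hard bitrate cap."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-preset", "veryfast",
        "-b:v", f"{kbps}k", "-maxrate", f"{kbps}k", "-bufsize", f"{2 * kbps}k",
        "-c:a", "aac", "-b:a", "160k",
        "-f", "flv", rtmp_url,
    ], check=True)

def record_hevc(src: str, out: str) -> None:
    """Local recording: hardware HEVC with the bitrate cranked up."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "hevc_amf", "-b:v", "40M",  # AMF encoder name is an assumption
        "-c:a", "copy",
        out,
    ], check=True)
```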

[–] hperrin@lemmy.world 1 points 1 month ago

Other than ray tracing, those are all gimmicky. You should buy the card that can run the games you want to play at the resolution you want to play them at. During the RTX 3000 vs RX 6000 generation, AMD had substantially better price-to-performance for everything except ray tracing. Now that's changed, and AMD is a much less appealing deal.

[–] festus@lemmy.ca 12 points 1 month ago (1 children)

The big reason I switched back to Nvidia was because I wanted to play with some local AI models, and doing that with AMD cards was quite difficult at the time (I think it's improved a little, but still isn't straightforward).

[–] BaroqueInMind@lemmy.one 6 points 1 month ago

I've tried to run a few minimal 8B and even 1.3B models on AMD cards, and they are such trash that my CPU can run them faster. Why did they write xformers in Python to only be compatible with Nvidia drivers?
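(Before blaming the card, the usual first sanity check is whether the framework sees the GPU at all. A minimal sketch, assuming a ROCm build of PyTorch, which exposes the device through the torch.cuda API; if this falls through to CPU, the slowness is a setup problem rather than the hardware.)

```python
# First sanity check on AMD: does PyTorch see the GPU at all?
import torch

def describe_backend() -> str:
    if torch.cuda.is_available():
        # ROCm builds of PyTorch report a HIP version; CUDA builds don't.
        hip = getattr(torch.version, "hip", None)
        backend = f"ROCm {hip}" if hip else f"CUDA {torch.version.cuda}"
        return f"{backend}: {torch.cuda.get_device_name(0)}"
    return "No GPU visible to PyTorch; inference will run on the CPU."

print(describe_backend())
```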

[–] MyOpinion@lemm.ee 8 points 1 month ago (1 children)

They are almost always cheaper, but going that low is a great way to lose money.

[–] helenslunch@feddit.nl 1 points 1 month ago

Is that Your Opinion?

[–] helenslunch@feddit.nl 6 points 1 month ago

I mean, yeah, they could sell them for $1 and take 90% of the market share....