[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 11 points 5 months ago (3 children)

I'd kill for a single-CCD 16-core X3D part. The 7950X3D is tempting with its 3D V-Cache CCD and its high-clock-speed CCD, but since not every game/program knows how to use it properly, you end up with hit-or-miss performance.

[–] ChairmanMeow@programming.dev 15 points 5 months ago (1 children)

Honestly, with the 7950X3D being so powerful, you rarely notice when a game isn't fully utilizing it. I have one and I'm very pleased with it!

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 5 points 5 months ago (1 children)

My biggest concern, from what I've seen, is that the scheduling workaround AMD uses to steer programs onto one set of cores versus the other wasn't exactly great last I looked, and it can cause issues when a game gets bounced from one CCD to the other. That said, I haven't looked into this since the CPU first came out, so hopefully things are better now.
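For what it's worth, if the scheduler does keep bouncing a game between CCDs, you can sidestep it by pinning the process yourself. Rough Python/psutil sketch, not AMD's actual mechanism - it assumes the V-Cache CCD is CCD0 (logical CPUs 0-15 with SMT) and uses a placeholder executable name, so check the core mapping on your own chip first:

```python
# Manual workaround sketch (not AMD's driver logic): pin a game to the
# 3D V-Cache CCD so it can't get moved to the frequency CCD mid-session.
# Assumes CCD0 = V-Cache CCD = logical CPUs 0-15 (8 cores + SMT); verify
# this mapping on your own system. Needs admin rights and `pip install psutil`.
import psutil

VCACHE_CPUS = list(range(16))   # logical CPUs on the cache CCD (assumption)
GAME_EXE = "game.exe"           # placeholder process name

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == GAME_EXE:
        proc.cpu_affinity(VCACHE_CPUS)  # restrict scheduling to those CPUs
        print(f"Pinned PID {proc.pid} to CPUs 0-15")
```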

How sensitive are you to micro-stutters in games? That was the biggest reason I got the 5800X3D in the first place, but now that I have a better GPU I can tell that chip struggles. And from what I remember, most of the issues you'd have moving from CCD to CCD were micro-stutters rather than normal frame-rate dips or just lower average frame rates.

[–] ChairmanMeow@programming.dev 4 points 5 months ago

I only really notice stutters in heavily modded Minecraft, where they're clearly linked to the garbage collector. In more demanding games I don't really notice any stuttering, or at least none that I can't easily link to something triggering in the game that's likely causing it.
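If you want to sanity-check that it's really the GC, here's a rough sketch of what I mean - assuming JDK 9+ unified GC logging (`java -Xlog:gc:file=gc.log ...`) where pause lines end with a duration in ms; the log path and the 144 Hz frame budget are just placeholders:

```python
# Rough sketch: scan a JVM unified GC log and flag pauses longer than one
# frame at 144 Hz (~6.9 ms) - long enough to show up as a visible stutter.
# Exact log format depends on JVM version and logging flags (assumption here).
import re

FRAME_BUDGET_MS = 1000 / 144                    # ~6.9 ms per frame at 144 Hz
pause_re = re.compile(r"Pause.*?(\d+\.\d+)ms")  # e.g. "... Pause Young ... 12.345ms"

with open("gc.log") as log:                     # placeholder path
    for line in log:
        match = pause_re.search(line)
        if match and float(match.group(1)) > FRAME_BUDGET_MS:
            print(f"{float(match.group(1)):6.1f} ms pause: {line.strip()}")
```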

Sure, perhaps I get slightly lower average FPS compared to a 7800X3D, but I also use this PC for productivity, and there the extra oomph really does help. Still, 97% of a framerate that's already way beyond what my 144 Hz monitors can display is still way beyond what my monitors can display. I don't think the performance difference is really noticeable, other than in certain benchmarks or if you try really hard to see it.

It's considerably faster than a 5800X3D, though.

[–] 30p87@feddit.de 2 points 5 months ago (1 children)

I'm also wondering why there's even a difference in FPS between higher-end CPUs - shouldn't the GPU be the bottleneck, especially at 4K with high settings?

1% and 0.1% lows will almost always be CPU-bound as the game streams more in - well, assuming VRAM isn't what's limiting you. Games are pretty CPU-intensive these days, since the PS5 and Xbox no longer have potato CPUs. At 120+ FPS I regularly see over 50% CPU usage in most games, and that's with nothing running in the background. In the real world you have a ton of background tasks, YouTube videos, Discord, etc. eating your CPU.
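To put numbers on what "1% / 0.1% lows" means: one common way to compute them is the average FPS of your slowest frames, which is why a CPU hiccup tanks them even when the average looks fine. Quick illustrative sketch - the frame times are made up, and tools like CapFrameX/PresentMon may use slightly different definitions:

```python
# Illustrative sketch: compute average, 1% low and 0.1% low FPS from a list
# of frame times in milliseconds (synthetic data with a few injected spikes).
# Definitions vary between tools; this uses "average FPS of the slowest N% of frames".
import random

random.seed(1)
frame_times_ms = [random.gauss(7.0, 1.0) for _ in range(10_000)]  # ~144 FPS average
for i in range(0, len(frame_times_ms), 500):                      # sprinkle in stutters
    frame_times_ms[i] = 30.0

def low_fps(frame_times, fraction):
    """Average FPS over the slowest `fraction` of frames (e.g. 0.01 = 1% low)."""
    count = max(1, int(len(frame_times) * fraction))
    worst = sorted(frame_times, reverse=True)[:count]
    return 1000 / (sum(worst) / len(worst))

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"avg {avg_fps:.0f} FPS | 1% low {low_fps(frame_times_ms, 0.01):.0f} FPS "
      f"| 0.1% low {low_fps(frame_times_ms, 0.001):.0f} FPS")
```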

Also, the 4090 is an absolute beast. My 5800X3D holds it back pretty often, honestly.