Starfield is a “bizarrely worse experience” on Nvidia and Intel, says Digital Foundry
(www.theverge.com)
Microsoft owns Bethesda. Microsoft owns Xbox.
Xbox uses AMD GPUs and CPUs.
So the game being optimised for AMD makes absolute sense for Microsoft.
AMD paying for access to optimise for their PC CPUs and GPUs makes sense for AMD.
However, not optimising the game for Intel and Nvidia does not make sense for Microsoft. This is more likely an oversight and a typical rough AAA launch than a deliberate play to benefit AMD. Other games, Cyberpunk 2077 for example, had problems across many CPUs and GPUs; here there is a selection bias in that fewer problems appear on AMD systems, alongside a generally reasonably solid launch.
It's frustrating, but most of the issues are optimisation shortfalls, not game-breaking bugs. The experience on Intel/Nvidia systems is good, just not as good as it could be. One example in the article was a framerate of 95 FPS vs 105 FPS - that may have been avoidable, but it's a minor annoyance at best. Some of this (not all, but some) is obsessing over minutiae that won't affect the player experience.
So basically a storm in a teacup, and much of this is the usual post-launch technical trouble that will be ironed out with patches. This is why people shouldn't buy games at launch, although so far at least we haven't seen the game-breaking bugs that have dogged other AAA titles at launch.