this post was submitted on 17 Nov 2023
118 points (91.5% liked)

PC Gaming

8533 readers
1309 users here now

For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-english sources. If the title is clickbait or lacks context you may lightly edit the title.)

founded 1 year ago
[–] Norgur@kbin.social 96 points 11 months ago (6 children)

Thing is: there is always the "next better thing" around the corner. That's what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.

[–] Sigmatics@lemmy.ca 67 points 11 months ago (3 children)

Exactly. The best time to buy a graphics card is never

[–] wrath_of_grunge@kbin.social 16 points 11 months ago (1 children)

Really, my rule of thumb has always been to upgrade when it's a significant upgrade.

For a long time I didn't really upgrade until it was a 4x increase over my old card, though certain exceptions were occasionally made. Nowadays I'm a bit more opportunistic in my upgrades, but I still seek out 'meaningful' ones: a decent jump over the old, typically a 50% improvement in performance, or upgrades I can get for really cheap.

[–] schmidtster@lemmy.world 12 points 11 months ago (3 children)

4x…? Even with older cards that's more than 5 years between upgrades.

A 4080 is only 2.5x as powerful as a 1080 Ti.

[–] Sigmatics@lemmy.ca 10 points 11 months ago* (last edited 11 months ago) (1 children)

What's wrong with upgrading once every 5-10 years? Not everyone plays the latest games on 4k Ultra

Admittedly 4x is a bit steep, more like 3-4x

[–] schmidtster@lemmy.world 3 points 11 months ago* (last edited 11 months ago) (1 children)

Starfield requires at least a 1070 Ti to play. It's not just about fidelity; you just wouldn't be able to play any newer games.

[–] jmcs@discuss.tchncs.de 9 points 11 months ago* (last edited 11 months ago) (1 children)

It depends on what you need. I think you can usually get the best bang for your buck by buying the now-previous generation when the new one is released.

[–] miketunes@lemmy.world 5 points 11 months ago (4 children)

Yup, just picked up a whole PC with an RTX 3090 for $800.

[–] massive_bereavement@kbin.social 9 points 11 months ago (1 children)

Graphics card. Not even once.

[–] hydroel@lemmy.world 9 points 11 months ago (1 children)

Yeah, it's always that: "I want to buy the new shiny thing! But it's expensive, so I'll wait a while for its price to come down." You wait a while, the price comes down, you buy the new shiny thing, and then the newest shiny thing comes out.

[–] nik282000@lemmy.ca 6 points 11 months ago (1 children)

I bought a 1080 for my last PC build, downloaded the driver installer, and ran the setup. There were ads in the setup for the 20 series, which had launched the day before. FML

[–] Norgur@kbin.social 8 points 11 months ago

Yep. I bought a 4080 just a few weeks ago. Now there are ads for the refresh all over... Thing is: your card didn't get any worse. You thought the card was a good value proposition when you bought it, and it hasn't lost any of that.

[–] Outtatime@sh.itjust.works 48 points 11 months ago (5 children)

I'm so sick of Nvidia's bullshit. My next system will be AMD just out of spite. That goes for processors as well.

[–] kureta@lemmy.ml 15 points 11 months ago

The only thing keeping me is CUDA, and there's no replacement for it. I know AMD has their equivalent (I forget what it's called), but it's not a realistic option for many machine learning tasks.

[–] CaptainEffort@sh.itjust.works 14 points 11 months ago

That’s exactly why I’ve been using AMD for the past 2 years. Fuck Nvidia

[–] dojan@lemmy.world 11 points 11 months ago (1 children)

I went with an AM5 and an Intel Arc GPU. Quite satisfied, the GPU is doing great and didn’t cost an arm and a leg.

[–] Nanomerce@lemmy.world 5 points 11 months ago (1 children)

How is the stability in modern games? I know the drivers are way better now, but more samples are always great.

[–] dojan@lemmy.world 6 points 11 months ago

Like, new releases? I don’t really play many new games.

Had Baldur’s Gate III crash once, and that’s the newest title I’ve played.

Other than that I play Final Fantasy XIV, Guild Wars 2, The Sims and Elden Ring, never had any issues.

[–] Vinny_93@lemmy.world 4 points 11 months ago

Considering the price of a 4070 vs the 7800XT, the 4070 makes a lot more sense where I live.

But yes, the way AMD keeps their software open (FSR, FreeSync) and puts DisplayPort 2.1 on their cards creates a lot of goodwill for me.

[–] Cagi@lemmy.ca 4 points 11 months ago* (last edited 11 months ago) (1 children)

The only thing giving me pause about ATI cards is that their ray tracing is allegedly visibly worse. They say next gen will be much better, but we shall see. I love my current non-ray-tracing card, an RX 590, but she's getting a bit long in the tooth for some games.

[–] limitedduck@awful.systems 15 points 11 months ago (2 children)

ATI

"Now that's a name I've not heard in a long time"

[–] Cagi@lemmy.ca 14 points 11 months ago (1 children)

Not since, oh before most of Lemmy was born. I'm old enough to remember when Nvidia were the anti-monopoly good guys fighting the evil Voodoo stranglehold on the industry. You either die a hero or you live long enough to see yourself become the villain.

[–] PenguinTD@lemmy.ca 4 points 11 months ago

Yeah, that's pretty much why I stopped buying Nvidia after the GTX 1080. CUDA was bad in terms of their practices, but not that impactful, since OpenCL etc. could still be tuned to work properly with similar performance; it's just that software developers and researchers love free support/R&D/money to further their goals. They're willing to be the minions, and I can't ask them not to take the free money. But RTX and then tensor cores are where I draw the line, since those patents and implementations do actual harm in the computer graphics and AI research space, though I guess it was already a bit too late. We are seeing the results, and Nvidia is making bank with that advantage. They're essentially applying the Intel playbook, just slightly differently: they don't buy the OEM vendors, they "invest" in software developers and researchers so they use Nvidia's closed tech. Now everyone pays a premium for RTX/AI chips from Nvidia, and the capital boom from AI will make the gap hard for AMD to close. After all, R&D requires lots of money.

[–] be_excellent_to_each_other@kbin.social 5 points 11 months ago (1 children)

I have to admit I still tend to call them that, too. Old-timers, I guess.

The first GPU I remember being excited to pop into my computer and run was a Matrox G400 Max. Damn, I'm old.

[–] Cagi@lemmy.ca 4 points 11 months ago

I would have been so jealous. Being able to click "3D acceleration" felt so good when I finally upgraded. But I was 12, so my dad was in charge of PC parts. Luckily he was kind of techy, so we got there. Being able to run Jedi Knight: Dark Forces II at max settings is a day I'll never forget, for some reason, lol.

[–] GarytheSnail@programming.dev 17 points 11 months ago (1 children)

All three cards are rumored to come with the same memory configuration as their base models...

Sigh.

[–] Fungah@lemmy.world 7 points 11 months ago (1 children)

Give us more fucking VRAM, you dicks.

[–] the_q@lemmy.world 16 points 11 months ago (1 children)
[–] zoe@jlai.lu 4 points 11 months ago* (last edited 11 months ago)

Just 10-15 years at least, for smartphones/electronics overall too. Process nodes are now harder than ever to shrink. Holding on to my 12nm CCP phone like there is no tomorrow...

[–] sederx@programming.dev 15 points 11 months ago

I saw a 4080 on Amazon for $1200, shit's crazy.

[–] gnuplusmatt@reddthat.com 12 points 11 months ago* (last edited 11 months ago) (5 children)

As a Linux gamer, this really wasn't on the cards anyway

[–] BCsven@lemmy.ca 4 points 11 months ago (1 children)

AMD is a better decision, but my Nvidia card works great with Linux. I'm on openSUSE, and Nvidia hosts their own openSUSE drivers, so it works from the get-go once you add the Nvidia repo.

[–] gnuplusmatt@reddthat.com 3 points 11 months ago (1 children)

I had an Nvidia 660 GT back in 2013. It was a pain in the arse being on a leading-edge distro; it used to break Xorg for a couple of months every time there was an Xorg release (which admittedly is really rare these days since it's in sunset mode). Buying an AMD card was the best hardware decision: no hassles, and I've been on Wayland since Fedora 35.

[–] CeeBee@lemmy.world 3 points 11 months ago (1 children)

A lot has changed in a decade.

[–] RizzRustbolt@lemmy.world 11 points 11 months ago

freezes

stands there with my credit card in my hand while the cashier stares at me awkwardly

[–] joneskind@beehaw.org 6 points 11 months ago (2 children)

It really is a risky bet to make.

I doubt a full-price RTX 4080 SUPER will be worth it over a discounted regular RTX 4080.

SUPER upgrades have never crossed +10%.

I'd rather wait for the Ti version.
