this post was submitted on 01 Jan 2024
67 points (91.4% liked)

Games

16728 readers
555 users here now

Video game news oriented community. No, NanoUFO is not a bot :)

Posts.

  1. News oriented content (general reviews, previews or retrospectives allowed).
  2. Broad discussion posts (preferably not only about a specific game).
  3. No humor/memes, etc.
  4. No affiliate links.
  5. No advertising.
  6. No clickbait, editorialized, sensational titles. State the game in question in the title. No all caps.
  7. No self promotion.
  8. No duplicate posts; the newer post will be deleted unless it has more discussion than the older one.
  9. No politics.

Comments.

  1. No personal attacks.
  2. Obey instance rules.
  3. No low-effort comments (one or two words, emoji, etc.).
  4. Please use spoiler tags for spoilers.

My goal is just to have a community where people can go and see what new game news is out for the day and comment on it.

Other communities:

Beehaw.org gaming

Lemmy.ml gaming

lemmy.ca pcgaming

founded 1 year ago
[–] pennomi@lemmy.world 37 points 10 months ago (3 children)

We don’t need faster GPUs as much as we need more VRAM. Double the memory instead of leaving it stagnant again.

[–] BigDaddySlim@lemmy.world 35 points 10 months ago

It's not just the lack of VRAM; it's also Nvidia stupidly narrowing the memory bus on lower-tier cards compared to their last-gen counterparts.
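For context on why the bus width matters: memory bandwidth scales directly with bus width times the memory's data rate. A quick sketch (the 3060 Ti / 4060 Ti specs below are the commonly cited ones, so treat them as illustrative rather than authoritative):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 3060 Ti: 256-bit bus, 14 Gbps GDDR6
last_gen = bandwidth_gbs(256, 14)   # 448.0 GB/s
# RTX 4060 Ti: 128-bit bus, 18 Gbps GDDR6
current = bandwidth_gbs(128, 18)    # 288.0 GB/s

print(f"3060 Ti: {last_gen:.0f} GB/s vs 4060 Ti: {current:.0f} GB/s")
```

Even with faster GDDR6, halving the bus width leaves the newer card with noticeably less raw bandwidth than its predecessor.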

[–] 9488fcea02a9@sh.itjust.works 7 points 10 months ago (1 children)

I don't understand the VRAM cuts. The RAM fabs have been cutting production because of low prices.

I would love more VRAM so that I can have a GPU that can do a bit of gaming and dabble in some AI stuff. 100% agree, I'd pay for more VRAM instead of horsepower.
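Rough napkin math on why VRAM, not compute, is the binding constraint for that kind of AI dabbling (the parameter counts and precisions below are illustrative, and this counts weights only, ignoring activations and KV cache):

```python
def model_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate VRAM needed just to hold model weights, in GB."""
    return params_billions * bytes_per_param

# A 7B-parameter model at fp16 (2 bytes/param) already exceeds a 12GB card
fp16 = model_vram_gb(7, 2)       # 14.0 GB
# 4-bit quantization (0.5 bytes/param) brings it back within reach
quant4 = model_vram_gb(7, 0.5)   # 3.5 GB

print(f"7B model: {fp16} GB at fp16, {quant4} GB at 4-bit")
```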

[–] CaptainProton@lemmy.world 5 points 10 months ago (1 children)

More memory means you can do real work with it, and enterprise AI training is a money printer they'd be cannibalizing if consumer cards were closer substitutes.

[–] tal@lemmy.today 1 points 10 months ago

Honestly, the gap between the server parallel compute cards and the home video cards isn't that large. 24GB on video cards, 80GB for a compute card.

That's not even two binary orders of magnitude. That's a narrow window to try to make their money from. Plus, some tasks can be subdivided and run on multiple GPUs, and they can't segment up the market for those.
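A quick check of the "not even two binary orders of magnitude" figure, using the 24GB and 80GB numbers from the comment above:

```python
import math

# Gap between a 24GB consumer card and an 80GB compute card,
# expressed in binary orders of magnitude (doublings)
gap = math.log2(80 / 24)

print(f"{gap:.2f} binary orders of magnitude")  # ~1.74, i.e. less than 2 doublings
```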

In general, my bet is that for most tasks that fit in that window and can't be subdivided, there's probably enough room for algorithmic improvements to get two binary orders of magnitude of reduction in memory requirements.

[–] CaptainProton@lemmy.world 5 points 10 months ago

But then you can do work with it, and that's where the real money is at.

They should all be shipping with 32GB now... AMD is at least seeing the light by releasing some 24GB cards under $1k.

I really hope Intel's next generation of GPU silicon makes it a more realistic substitute; that would actually spice things up a lot. You basically won't see real competition again until Nvidia's AI training dominance is in someone's crosshairs.

[–] Evil_Shrubbery@lemm.ee 21 points 10 months ago (1 children)

The 5080 will have a 128-bit bus and 6GB of VRAM. The Ti version will only have it clocked higher and be able to actually address all of the RAM.

[–] miss_brainfart@lemmy.ml 15 points 10 months ago* (last edited 10 months ago) (1 children)

Would it really be unexpected? They've blatantly shown how they want to milk us for every little incremental improvement, even ones that sometimes barely qualify as a sidegrade.

[–] fuckwit_mcbumcrumble@lemmy.world 2 points 10 months ago (1 children)

What sucks is that the 4090 is an amazing GPU. It's priced for how it performs (top dog, bar none). It's the lower-end cards that are the problem. When a lower-end card is a worse value per dollar than the flagship, something is horrendously wrong.

[–] MomoTimeToDie@sh.itjust.works -3 points 10 months ago

This. The 4090 absolutely crushes this generation of cards, hands down. And while it's expensive, it also feels like a "pay more, get more" type deal. Even just dropping down to the 4080, questions start to come up about the price. It's barely a step over the 3090 and 3080, but it's priced as if it sits alongside the 4090.

[–] umbrella@lemmy.ml 14 points 10 months ago* (last edited 10 months ago) (1 children)

will we be able to afford it?

[–] Nomecks@lemmy.ca 10 points 10 months ago* (last edited 10 months ago)

Lol no. For AI, research, and studio 3D rendering only.

*except Arnold. I don't need render farm neckbeards downvoting me

[–] Zahille7@lemmy.world 11 points 10 months ago (3 children)

Bro I'm still using my 2060 Super

[–] Cavemanfreak@lemm.ee 8 points 10 months ago

And I'm still here on my 1060 6GB rocking 40 fps on medium in Jedi: Survivor.

[–] AstralPath@lemmy.ca 5 points 10 months ago

2060 user here as well. Could use a bit of an upgrade, but not much.

[–] ThugJesus@lemmy.world -5 points 10 months ago (5 children)

I've yet to encounter a game my 2080S couldn't run at 144Hz on max graphics... Why are we still pumping out $800-1200 GPUs when there's nothing requiring that amount of power?

[–] hips_and_nips@lemmy.world 9 points 10 months ago* (last edited 10 months ago)

nothing requiring that amount of power

My simulators, Pimax, and three 4K 144Hz monitors beg to differ.

[–] lurker8008@lemmy.world 9 points 10 months ago

Must be nice hitting that 2010s backlog. I've still got so many unplayed Steam games. And don't even get me started on all the free Epic games.

[–] MomoTimeToDie@sh.itjust.works 4 points 10 months ago (1 children)

Congratulations on not playing anything intensive, I guess?

[–] fuckwit_mcbumcrumble@lemmy.world 2 points 10 months ago

It's 144hz, not fps. For all we know he could be at like 30fps and it still counts.

[–] Transcendant@lemmy.world 3 points 10 months ago

Yeah but you've never played Alan Wake 2 with turbo mode ray tracing in 28000p /s

(proud owner of a 2070s still chugging along here)

[–] Geth@lemmy.dbzer0.com 2 points 10 months ago

I recently upgraded both CPU and GPU for better VR performance. Before that, my 3070 was struggling a bit with Cyberpunk on the 4K TV. There's also the productivity and AI stuff people deal with these days.

There's plenty of needs that require more and more power.

[–] yggstyle@lemmy.world 7 points 10 months ago (1 children)

They still haven't dumped all those chips they claimed on their last three earnings calls. Expect the 5000 series to have a familiar flavor.