this post was submitted on 17 Mar 2024
247 points (100.0% liked)


How does this KEEP GETTING WORSE??

[–] barsoap@lemm.ee 11 points 8 months ago (2 children)

We had 4-way split screen on 20-inch tube TVs, on hardware that measured its RAM in MBs

And we were still compute-bound. Machines like the N64 spent their resources pretty much per pixel; mesh data was so light that a whole level could sit in the limited RAM at once -- and it needed to, because there were no CPU cycles left over to implement asset streaming. Nowadays the only stuff in RAM is what you actually see, and with four perspectives, yes, you need four times the VRAM, because every player can be looking at something completely different.
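To put toy numbers on that (my own sketch, not from the comment -- the asset names and sizes are made up): model each camera's view as a set of visible assets, and VRAM residency as the union across all active cameras. With one camera you pay for one view's assets; with four cameras looking at different things, the worst case approaches four times that.

```python
# Toy model: VRAM residency = union of every active camera's visible assets.
# Asset names and MB figures below are invented for illustration only.

ASSET_MB = {"castle": 600, "forest": 550, "cave": 500, "harbor": 450}

def resident_mb(views):
    """MBs of VRAM needed: the union of all assets any camera can see."""
    needed = set().union(*views)
    return sum(ASSET_MB[a] for a in needed)

single = resident_mb([{"castle"}])                    # one camera
split4 = resident_mb([{"castle"}, {"forest"},
                      {"cave"}, {"harbor"}])          # four cameras, far apart
print(single, split4)  # → 600 2100: worst case approaches 4x
```

If all four players stand in the same area the union collapses back toward the single-camera cost, which is why the 4x figure is a worst case rather than a constant.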

Sure, you can write the game to use a quarter of the resources, but then you either ship that for singleplayer too and get bad reviews for bad graphics, or you develop two completely different sets of assets, exploding development costs. I'm sure there also exist shady kitten-drowning marketing fucks who would object on the grounds of "but hear me out, let's just sell them four copies instead", but they don't even get to object, because production-wise split screen isn't an option nowadays for games that aren't specifically built around it. You can't just bolt it onto any random title for a tenner.

[–] Couldbealeotard@lemmy.world 2 points 8 months ago (1 children)

You know, a fun game doesn't have to be built to use all available resources.

[–] barsoap@lemm.ee 2 points 8 months ago (1 children)

I completely agree, but I doubt you can afford a Star Wars license if you're making an indie game. It needs oomph and AAA sales to pay for itself, and that's before Disney marketing gets their turn to say no, because they've seen the walk cycles in Clone Wars and went "no, we can't possibly go worse than that, it'd damage the brand". I'm digressing, but those walk cycles really are awful.

[–] Couldbealeotard@lemmy.world 2 points 8 months ago

I feel like your comment isn't a reply to what I've said.

[–] isles@lemmy.world 1 points 8 months ago (1 children)

Aren't you also effectively down-resing the four screens? You're not running 4x 1080p streams, you're running 540p each, so textures can be downscaled with no perceptual loss. Non-console versions are already designed to adapt to system resources, so I don't see why you'd need two completely different sets of assets. (IANA dev)
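The arithmetic behind this point can be made explicit (my own back-of-envelope sketch, assuming uncompressed RGBA8 textures and a full mip chain): halving a texture's width and height quarters its memory, which would roughly cancel the 4x player count for texture memory.

```python
def texture_mb(width, height, bytes_per_texel=4, mip_chain=True):
    """Approximate VRAM footprint of one texture in MB.

    Assumes uncompressed RGBA8 (4 bytes/texel); a full mip chain adds
    roughly one third on top of the base level (1 + 1/4 + 1/16 + ... = 4/3).
    """
    base = width * height * bytes_per_texel / 2**20
    return base * 4 / 3 if mip_chain else base

full = texture_mb(4096, 4096)   # e.g. an asset authored for 1080p+
half = texture_mb(2048, 2048)   # same asset one resolution step down
print(f"{full:.1f} MB vs {half:.1f} MB")  # half the resolution = 1/4 the memory
```

Real games complicate this with block compression and streaming pools, but the quadratic relationship between resolution and texture memory holds either way.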

[–] barsoap@lemm.ee 2 points 8 months ago* (last edited 8 months ago)

Textures can be scaled quite easily, at least when talking about 4K down to 1K; below that, GFX people may shout at you, because automatic processes are bound to lose the wrong details. Meshes, OTOH? Forget it.

Also, I kind of was assuming 4x 1K to throw at a 4K screen, as people were talking about home theatre screens. The "uses 4x VRAM" equation compares displaying 4x at a quarter of the resolution against 1x at full resolution, whatever that resolution may be, and assumes you don't have a second set of close-up assets. You can't just reuse LOD assets: being in the distance is a different thing than being in the foreground at a lower resolution. Foreground assets get way more visual attention -- it's just how human perception works -- so you can't get away with auto-decimating and whatnot.