this post was submitted on 19 Oct 2023
113 points (91.9% liked)
Your point about the screenplay reminds me of one of my biggest pet peeves with armchair commenters on AI these days.
Yeah, if you hop on ChatGPT, use the free version, and just ask it to write a story, you're getting crap. But using that anecdotal experience to extrapolate what the state of the art can do in production is a massive mistake.
Do professional writers just sit down at a computer and write out page after page into a final draft?
No. They start with a treatment, build out character arcs, write summaries of scenes, etc. Eventually they have a first draft which goes out to readers and changes are made.
To have an effective generative AI screenplay writer you need to replicate multiple stages and processes.
And you likely wouldn't be using a chat-instruct fine tuned model, but rather individually fine tuned models for each process.
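The staged approach described above can be sketched in a few lines. This is a hypothetical illustration only: `generate` is a stub standing in for any model call (in a real pipeline each stage might route to its own fine-tuned model), and the stage names simply mirror the treatment → arcs → scenes → draft → revision process mentioned earlier.

```python
# Sketch of a multi-stage generative writing pipeline. Each stage feeds
# its output forward as context for the next, instead of asking a model
# for a finished story in one shot.

def generate(prompt: str, context: str = "") -> str:
    """Stub standing in for an LLM call; a production pipeline would
    dispatch each stage to a separately fine-tuned model."""
    return f"[{prompt} | given: {context[:40]}]"

def staged_screenplay(premise: str) -> dict:
    stages = {}
    stages["treatment"] = generate("write a treatment", premise)
    stages["arcs"] = generate("build character arcs", stages["treatment"])
    stages["scenes"] = generate("summarize scenes", stages["arcs"])
    stages["draft"] = generate("write first draft", stages["scenes"])
    # Reader-feedback loop: generate notes, then revise against them.
    notes = generate("give reader notes", stages["draft"])
    stages["final"] = generate("revise draft", stages["draft"] + notes)
    return stages

result = staged_screenplay("a detective who forgets each case at dawn")
```

The point of the structure, not the stub, is what matters: quality comes from decomposing the task the way a professional writer does, not from a single free-form prompt.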
Video game writing is going to move toward writing pipelines for content generation rather than writing final copy. And my guess is that most writers are going to be very happy when they see what that can achieve, because they'll be able to create storytelling experiences currently regarded as impossible: ones where character choices genuinely affect outcomes, instead of offering the illusion of choice to keep dialogue trees from fractalizing too early.
People are just freaking out thinking the tech is coming to replace them, rather than realizing that headcounts are going to remain the same long term; with the technology enhancing their efforts, they'll be creating products beyond what they've even imagined.
Like, I really don't think the average person - possibly even the average person in the industry - really has a grasp of what a game like BG3 with the same sized writing staff is going to look like with the generative AI tech available in just about 2-3 years, even if the current LLM baseline doesn't advance at all between now and then.
A world where every NPC feels like a fleshed-out, dynamic individual with a backstory, goals, and relationships. Where stories uniquely evolve with the player. These things have previously been technically impossible given resource constraints, and attempts to even superficially resemble them ate up significant portions of AAA budgets (e.g. RDR2). By the end of the next console generation, they will be as standard as ray tracing or voiced lines are today.
That's a win-win all around.
They largely are going to remain the same. Specific roles may shift around as particular workloads become obsolete, and a handful of companies will chase quarterly returns at the cost of long-term returns by downsizing: keeping the product the same while reducing headcount.
But most labor is supply constrained not demand constrained, and the only way reduced headcounts would remain the status quo across companies is if all companies reduce headcounts without redirecting improved productivity back into the product.
You think a 7x reduction in texturing labor is going to result in the same amount of assets in game but 1/7th the billable hours?
No, that's not where this is going. Again, a handful of large studios will try to get away with that initially, but as soon as competitors that didn't go the downsizing route are releasing games with scene complexity and variety that puts their products to shame that's going to bounce back.
If the market was up to executives, they'd have a single programmer re-releasing Pong for $79 a pop. But the market is not up to executives, it's up to the people buying the products. And while AI will allow smaller development teams to produce games in line with today's AAA scale products, tomorrow's AAA scale products are not going to be possible with significantly reduced headcounts, as they are definitely not going to be the same scale and scope as today's leading games.
A 10 or even 100 fold increase in worker productivity only means a similar cut in the number of workers as long as the product has hit diminishing returns on productivity investment, and if anything the current state of games development is more dependent on labor resources than ever before, so it doesn't seem we've hit that inflection point or will anytime soon.
Edit: The one and only place I can foresee a significant headcount drop because of AI in game dev is QA. They're screwed in a few years.
I hear this, but then I also think of the "So... what happened to all the horses?" question.
Their numbers went down. Drastically. That's what happened. But that isn't history when it happens to horses.
Do you think that same result would have happened if horses had other skills outside of the specific skill set that was automated?
If horses happened to be really good at pulling carts AND really good at driving, for instance, might we not instead have even more horses than we did at the turn of the 20th century, just having shifted from pulling carts to driving them?
I'm not sure the inability of horses to adapt to changing industrialization is the best proxy for what's going to happen to humans.
How do you train AI to notice the bugs humans notice? That kind of seems like exactly the software's weakness: creating odd edge cases that make sense for the algorithm but not to the human eye.
Not really.
One of the big mistakes I see people make in trying to estimate capabilities is thinking in terms of all-in-one models.
You'll have one model that plays the game, trying a wider range of inputs and approaches to reach goals than what humans would produce (similar to existing research like OpenAI training models to play Minecraft and mine diamonds from a handful of videos with input data plus a large corpus of YouTube videos).
Then the outputs generated by that model would be passed through another process that looks specifically for things ranging from sequence breaks to clipping. Some of those, like sequence breaks, aren't even detections that need AI, and depending on just what data is generated by the 'player' AIs, a fair bit of other issues can be similarly detected with dumb approaches. The bugs that would be difficult for an AI to detect would be things like "I threw item A down 45 minutes ago but this NPC just had dialogue thanking me for bringing it back." But even things like this are going to be well within the capabilities of multimodal AI within a few years, as long as hardware continues to scale such that it doesn't become cost prohibitive.
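To illustrate how some of these checks need no AI at all, here is a minimal sketch of a "dumb" consistency rule run over a synthetic play trace. The event names and trace format are made up for this example; the idea is just that once a player agent produces structured logs, the discarded-item dialogue bug described above falls out of a plain rule.

```python
# Hypothetical rule-based check over a play trace: flag dialogue events
# that reference an item the player no longer holds at that point.

def find_inconsistencies(trace):
    held = set()   # items currently in the player's inventory
    bugs = []      # (trace index, item) pairs that look inconsistent
    for i, (event, item) in enumerate(trace):
        if event == "pickup":
            held.add(item)
        elif event == "drop":
            held.discard(item)
        elif event == "dialogue_refers_to" and item not in held:
            bugs.append((i, item))
    return bugs

trace = [
    ("pickup", "amulet"),
    ("drop", "amulet"),
    ("dialogue_refers_to", "amulet"),  # NPC thanks us for a dropped item
]
# find_inconsistencies(trace) flags index 2
```

Sequence-break detection would work the same way: a table of ordering constraints checked against the log, with the model-based player only responsible for generating diverse traces to feed it.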
The way it's going to start: third-party companies dedicated to QA will feed their own data and play tests into models to replicate and extend those behaviors, offering synthetic play testing as a cheap additional service to find low-hanging fruit and cut down on the human tester hours needed, and over time it will shift more and more toward synthetic testing.
You'll still have human play testers for broader quality questions like "is this fun" - but the QA that's already being outsourced for bug hunting is almost certainly going to see AI replace humans entirely, or very nearly so.
You jest, but yeah, there very likely will be, especially given that there are already full self-driving cars on roads today. The difference is just that in ~10 years (by the end of the next console generation) there will be better full self-driving cars on the road.
I dare a self-driving car to drive through a bit of snow
Like this?