What is up everyone and welcome to the WAN Show!
I was up until 5am two times this week, and about 3:30 last night, so the energy might be a little lower.
And highly trained.
But we've got a great show for you guys today!
We're going to be talking about Nvidia's recent assertion that native resolution gaming is
dead.
Long live DLSS, and... ugh, I don't know if I can argue with them about this.
There was a massive Xbox leak revealing plans for consoles, controllers, games and dreams
of acquisitions.
What else we got to talk about this week?
Microsoft announced its Copilot for the 78th time, or something around there, I'm not entirely sure. And a video came out recently that was actually quite fantastic.
The mom and pop computer shop video that came out yesterday?
Was that yesterday?
Wait that's a topic?
We're going to talk about some of the behind the scenes stuff.
Oh fun!
Alright let's go ahead and roll that intro!
The show is brought to you today by Corsair, The Ridge and AG1.
Let's jump right into our first topic.
Yesterday Nvidia officially released DLSS 3.5.
So far it's only supported in Cyberpunk 2077 and the creative tool Chaos Vantage.
But DLSS 3.5 is kind of cool.
It adds Ray Reconstruction, which uses a supercomputer-trained AI network.
So buzzwords.
An AI network then.
Cool.
To generate higher-quality pixels in between sampled rays by recognizing lighting patterns from training data and recreating them during play.
Nvidia claims an 8% frame rate boost in Cyberpunk 2077 over DLSS 3 and also claims that it is
less demanding on the GPU though that has not been quantified.
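To make the upscaling argument concrete: DLSS-style super resolution renders each frame at a lower internal resolution and reconstructs the output resolution, so per-frame shading work roughly tracks the internal pixel count. A rough, illustrative back-of-envelope sketch (not Nvidia's actual cost model, and real-world gains are smaller because the reconstruction pass itself has a cost):

```python
# Illustrative arithmetic only: upscaling lets the GPU shade far fewer
# pixels per frame than a native render at the output resolution.

def pixels(width, height):
    """Total pixels in one frame."""
    return width * height

native_4k = pixels(3840, 2160)       # full native 4K render target
internal_1080p = pixels(1920, 1080)  # example lower internal resolution

ratio = native_4k / internal_1080p
print(f"Native 4K shades {ratio:.0f}x the pixels of a 1080p internal render")
```

This is why the "brute power vs. smarter rendering" framing matters: shading a quarter of the pixels and reconstructing the rest shifts the cost from raw raster throughput to the reconstruction network.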
This is wild though.
In response to a question about whether Nvidia plans to prioritize native resolution performance
in GPUs, Brian Catanzaro, Nvidia's VP of Applied Deep Learning Research, stated that
relying on brute power to improve graphical fidelity is a suboptimal solution when Moore's
Law no longer seems to hold.
Instead, the future of gaming graphics will be a greater reliance on AI assisted rendering
and upscaling.
According to Catanzaro, smarter technologies like DLSS need to be implemented to improve
graphics fidelity and circumvent the otherwise low gen on gen performance improvement seen
in today's graphics hardware.
This is a quote: "I think, yep, DLSS AI will eventually be able to replace traditional rendering entirely."
Now I think there's a ton for us to unpack here, because first of all, looking at it from a right-now, today perspective, I think it's fair to say that most gamers would strongly prefer to just render at native resolution and have an identical experience every time, with the game rendered accurately no matter what hardware they're on, right?
And it just all comes down to, well, how well optimized is your hardware?
But okay, there are a few things I want to bring up as a sort of devil's advocate argument. I mean, do you think that's fair to say, that most gamers would rather just game at native res?
Not necessarily.
I think I would be a little bit sketched out if I was a competitive player trying to use something that, yeah, seems pretty good most of the time, but maybe doesn't show me the thing that I need to see perfectly the one time I need to see it.
Right.
Cause you might actually be aiming at a pixel.
Yeah.
So, I don't know. But if I'm playing single-player games, I don't know how much I care. As long as it looks really good, it's probably fine, you know?
Sorry, it's not just aiming at a pixel either, and maybe this kind of ruins the competitive-game argument, but also things like pixel-perfect jumps, pixel-perfect timings. You could be competitive in a single-player game that you're trying to speedrun, or something like that.
Yeah.
All right.
Now, before you guys get too far ahead.
Oh, do we have a problem?
Do you want to figure that out?
Seems like maybe just flipping back.
Not sure.
Okay, cool.
Before we get too far ahead of ourselves here, it at least seems like the chat is mostly on the same page as my sort of perception of the situation: they would rather not rely on these AI upscaling or machine learning solutions. But can I just make a small point about game development?
How often is it that we see hacky solutions implemented in order for us to achieve playable frame rates?
You know, whether it's... man, I remember going down this rabbit hole into how reflections in the scope of a rifle are done in traditional gaming versus path-traced or ray-traced gaming. The kind of hacky solution there, I forget exactly what it was, but the way they could do it with a minimal performance impact and still have it seem relatively accurate just absolutely blew my mind.
What is the difference there? That's kind of my question to you.
What's the difference if it's implemented at a driver level by an Nvidia or an AMD, versus if it's implemented game by game in order to optimize the game?
Your thoughts.
Yeah.
Reflecting a pre-rendered image is at least one of the techniques for that.
I don't know.
Something that concerns me about a lot of this kind of stuff, too, is what we saw with Starfield. I know I'm giving kind of a politician answer, so I'm not answering the question that you asked.
I'm still going to ask you to answer the question I asked, but sure, go ahead.
But we saw that come out without DLSS, right? And sure, the community was able to respond that time; there was a paid solution, and then there were free solutions, and now Bethesda has said they're going to bring native DLSS to Starfield, and that's cool and whatever.
Sounds good.
But the quote that I heard kind of spooked me a little bit, that DLSS slash AI will eventually be able to replace traditional rendering entirely. Like, ah, that sounds like we're going to run into a lot of issues.
I feel like that would bring problems in relation to, like, abandonware, dead games that you can't play anymore because there's some weird thing going on with how it was expected to work at one point in time, but now our current rendering paths are different.
I also think there's a lot of potential for really not-cool combat between the various companies that have their various proprietary solutions, where one sponsors a game and then they try to, you know, line pockets to get them to not actually include certain solutions. So it's basically unplayable on whatever graphics card, and the performance difference between brands ends up being really dramatic depending on the company and the partnership it had with the GPU company. I mean, whether the anti-competitive stuff happens or not, it's not like we haven't seen similar things in the past.
And I don't necessarily know if it's going to be any different than your TressFX and your HairWorks and your PhysX and all the different things that we've seen over the years.
And can I again play devil's advocate here and say: okay, but what if the AI-assisted version is just legitimately better? Seeing what they can do with the amount of die area on a modern GPU that is dedicated to this machine learning processing, running through these trained models, if that advances at a faster rate than just raw GPU grunt, I mean, could we see a game developer be able to design like this?
You remember this, right?
Nvidia Canvas.
Yeah.
Yeah.
It wasn't being done in real time at, you know, 60 to 500 frames per second when they
first showed it off.
But I think it's pretty clear that in the machine learning space, we're seeing a development
or an advancement curve that is much faster than traditional GPU raster rendering.
So with that in mind, is this a better way?
I think it kind of already is, because with 3.0, we're already in a situation where it's clear that 4K is kind of the final frontier for you to have any hope of high-refresh-rate gaming.
I just don't foresee, and you know what, I'm probably going to end up eating these words at some point, but I don't foresee any need for us to drive games at significantly higher resolutions than that natively, because you have so much data to process at 4K that you could upscale it, not infinitely, but wow.
A lot.
Yeah.
Yeah.
And just the general improvements that you get from it are something that we've recommended people use for quite a while now.
So, like...
Yeah.
So, okay.
So I see 4K as basically the end of that, which is almost kind of an argument against my point, in that if we're not trying to push resolution higher, well then the obvious answer is to push visual fidelity higher. But then looking at, ah, shoot, yeah.
Starfield.
What's that technique called where they actually scanned real objects in order to get the texture quality super high?
Sorry, say again?
They scanned real world objects in order?
Yeah.
Uh, I feel like Dan is going to know. Photo-something... photogrammetry. Photogrammetry.
Yeah.
Yeah.
So if you could just create all of your in-game assets with photogrammetry and then run basically almost like a mocap sort of equivalent, or like a proxy-footage equivalent, where you're actually rendering with markers essentially and then your GPU is just interpreting. It's almost like compressing the entire asset library of the game, in a sense. Like, you kind of get what I mean.
Right.
Cause we're coming back to this.
That seems even more concerning for multiplayer games.
And like, I'm sure there's been this... what about destructible environments in multiplayer games using that setup?
As long as you... well actually, hold on. That could be a way to make destructible environments manageable from a performance standpoint, because if you've got this material and you've assigned characteristics to it, then all you need to do...
So the geometry isn't done that way?
Yeah.
Now you can spend all that compute on geometry, and then, exactly, the textures, or like the unique scarring of the surface or whatever.
Yeah.
That might differ from one player to the next, but it'll probably be reasonably close. Until, you know, someone discovers that if you wear, like, an all-blacked-out suit or something like that, and you happen to stick close to where explosions... where you can hide or something, you know. But what's the difference, Luke?
People are already exploiting everything anyway. So it's just a new thing to do, right? Like, you know, don't pretend your hands are totally clean.
I see you and Joe playing Tarkov with your second monitor, with your Discord, seeing each other's viewpoint. Like, that's not technically the way it's meant to be played, but realistically everybody else who's squadded up is doing it anyway.
So okay, fine.
And at a certain point, it's not, like, unfair. It's, uh, get good. Don't hate the player, hate the game.
Yeah.
I can just see you, dressed in black.
Sometimes you'll see engine demos, and those engine demos will, like, look really sick. The fidelity will be super high.
Yeah.
But then you start kind of diving more into it, and you start figuring out, like: okay, creative modeling from the community in this engine is not going to be a thing for XYZ reason. None of these environments are destructible. If you look at these environments from different angles, it starts to break down in pretty bad ways. Different things like that.
And my concern... just the statement of replacing traditional rendering keeps getting me concerned about the technology that we've already built for traditional rendering. Abandoning that and going forward, I feel, is going to have pitfalls that we don't necessarily understand, that I don't understand, in regards to the games that we can make.
Maybe I'm just completely wrong about this.
That sounds good to me.
Hopefully that's right.
But I mean, I think once we resolve a lot of the copyright and intellectual-property ownership issues around AI-generated assets... and maybe I'm just coming at this from a layperson's perspective, right? I don't do game development, but honestly it seems to me a lot easier.
Yeah, probably, like... easier.
Yeah.
But is it as good of a space to be creative in? That's the thing that I'm concerned about.
I guess it depends what kind of creativity you want.
Like, you know, I think we've talked before about, like, if I was to invest in making a game, what kind of game would it be? And honestly, if I was to spec out my dream game that I would make, it would be episodic, even though I know the business model is totally broken and doesn't work.
It would be heavily, heavily story-focused. Like, okay, there were these kind of trashy fantasy novels that I read when I was a teenager, by this author, Terry Brooks. And it's all about this fantasy world, Shannara. And honestly, it's all just adventuring all over this land.
And I could see a game like that where, instead of hiring game writers necessarily, you hire writers writers, and just create this world that gradually opens up, and with every episode creates more and more environments for you to go and visit. Essentially an MMO, but not...
You really got to play Baldur's Gate.
Massively multiplayer.
Oh, I know.
I know a hundred percent.
Um, you're really nicely defining Baldur's Gate 3.
That's great.
One difference, though, is that I wouldn't intend anything to be high fidelity. So from my point of view, I had almost zero interest there. Like, I was thinking, like, RPG Maker level of graphical fidelity. It doesn't matter to me.
But if I could just, you know, draw a stick figure and then say: okay, yeah, and now it just looks cool.
So you're talking about basically lowering the barrier of entry.
Yes.
You don't have to be as much of a coder, don't have to be as much of a designer. Or reallocating resources.
Because I think that's something that we've mostly agreed upon for a long time: that a lot of the time, the thing that games get wrong is they'll have these really cool technical details, or they'll be absolutely, you know, beautiful, but it's not actually fun. It's not engaging. The AAA curse. Or, like, the story sucks, and your sidekick dies and you're like, so you were just a three-dimensional...
Or worse than that, you're like, oh, thank goodness.
Now they're not going to bug me.
Yeah.
Thank gravy.
I don't have to do another escort quest.
Yeah, exactly.
So, I mean, I don't know.
We've got people in chat saying, "I want graphical fidelity or I'm not interested."
Yeah, but you can get that.
Yeah.
Yeah.
That's the thing.
I mean, no part of what he's saying is that you wouldn't get that.
What he's saying is that when he's making it, he doesn't have to worry about that.
He can draw stick figures, he can do whatever.
And then it's going to turn that into something more.
But again, you know, there's all this intellectual property stuff to figure out.
Oh yeah, definitely.
I don't know when or how, or if, that's going to be resolved. I suspect it will be, but yeah, when, how, or if is pretty questionable.
It's an interesting thing, because it's cool that more people, more things, are more accessible with it. But then it's also not cool, uh, taking people's jobs away. And also, you know, it's really good at getting something that's going to be acceptable to a decent number of people, but it's not really good at making masterpieces.
No, that's fair.
That's totally fair.
Like, there's this old story from... where is it?
Austria?
Is that where Beethoven's from?
Austria?
I can't remember.
Wherever he's from.
If I remember correctly, it's a... this could just be not even a true story.
So great.
Watch it.
Here comes fake story time.
Totally might be, totally might be.
I'm really glad we have a show for this.
Forever ago.
It might've even been a different, uh, composer.
Yeah.
I don't know.
So you don't know who.
You don't know where.
You don't know exactly when. You have no idea if it's true.
Why don't we just say, Luke's about to make up a story.
Yeah, yeah, yeah.
Okay.
Tell us a story.
But it was basically that every school in that country had a grand piano.
Yeah.
Because they're like, you never know when the next version...
In Luke's fictional country.
Okay.
Yes.
Carry on.
Every school had a grand piano because they wanted the opportunity, if another person like this came around, to be able to realize it.
Okay, cool.
That's a really neat idea.
Yeah.
If... from your made-up story about...