
The WAN Show

Every Friday, top Tech YouTuber Linus Sebastian and Luke Lafreniere meet to discuss current events in the tech world, a subject from which they do not stray. Hardly ever.


All right, guys, welcome to the WAN Show.
We've got our special guest, Ryan Shrout,
with us here this week.
And you may or may not have noticed
that he has completely taken the place of that guy
that I have evidently, once and for all, finally fired.
Finally?
It's about time, right?
I know, right?
He's been with me for like three years now.
It's actually long.
Kind of incredible, yeah.
Yeah, I mean, that's too long.
I should just let everyone in my life know
that I've had a relationship with for more than three years.
Like, wife, see ya.
Siblings, see ya.
Eventually kids, right?
Eventually kids.
Sorry, three years, that's it, that's it.
Yeah.
Yeah, my son's gonna make it to three
in the next sort of seven months or so.
Time to hit the road, kiddo.
Get a job where we're going.
All right, guys,
so we've got a bunch of great topics for you.
We are going to,
I'm assuming Ryan's gonna be with me on this,
but maybe he's gonna play devil's advocate
for me over here.
But I am planning to lay into Ubisoft
over the whole 30 FPS is more,
yes, that's the smile I was looking for.
Yeah.
30 FPS, so cinematic.
So we are definitely gonna be tackling that.
We're also gonna be chatting
about NVIDIA's brand new 970M and 980M mobile graphics cards
that are surprisingly interesting.
I mean, usually the mobile graphics card comes out,
it's kind of like, this time around,
there's actually really something to say.
HP is splitting up into two companies
and we've got another really good one here
that I'm scrolling through this list
as fast as I can to find,
because I can't remember what the bloody thing was.
Right, HTC launches two selfie optimized phones.
So we'll have some chat
and we'll definitely do some Twitter interaction
on that one.
Find out what you guys think of selfie phones.
But in the meantime, here's the intro.
If it ever rolls.
Oh, there's no audio again.
I can never remember which one has the audio.
I'll just sing.
Doot, doot, doot, doot, doot, doot, doot, doot, doot,
doot, doot, doot, doot, doot, doot, doot, doot, doot, doot.
You know, it's actually, it's hard to talk
with my voice in my ear with a slight delay.
It's impossible to sing.
So you get yourself back in your ear right now
when you do this?
Yes, I do.
Hold on, let me just thank our show sponsors.
Our first sponsor today is lynda.com.
Visit lynda.com slash wan show
to actually get a pretty super sweet free trial
of their excellent courses.
Kind of a funny story.
We had a stream that we did earlier this week
where someone who actually founded a potential competitor
to some of lynda.com services called them out by name
as an excellent service for learning things,
whether it's programming or digital photography
or whatever else the case may be.
Our second sponsor today,
and it helps if I don't lay them over top of each other
like that, is Phantom Glass.
Their little tagline is the last screen protector
you'll ever need.
And I totally disagree with that
because you actually do need different screen protectors
if you ever change your phone,
but it is the last screen protector you'll ever need
for your current model of phone
because it's made of Gorilla Glass 3,
just like the screen of your phone probably is
if you've got a good quality phone,
it's extremely difficult to scratch,
it's oleophobic, and it uses a fantastic
like nano BS thing that they got going on there
to somehow be completely bubble free.
Very cool stuff.
So I think that's pretty much it
for all of that, me doing nothing but talking constantly.
Why don't we jump right into Assassin's Creed Dev
thinks the industry is dropping the 60 FPS standard.
This was posted on the forum by The Crazed Child
and there's actually, there's a couple of good articles.
There's one from techradar.com,
there's one from GameSpot.
I'm just gonna pop these up here
so you guys can have a look while Ryan gives his thoughts.
Is 30 FPS enough?
Is it more cinematic, Ryan?
No, that's pretty much a crap excuse for
not being able to render at 60 frames per second, obviously.
I mean, if you really wanted to make it cinematic,
you'd just drop it all the way down
to 24 frames per second, right?
Like if you're gonna claim that's the reason,
then go ahead and do it all the way.
Damn it, Ryan, take your logic
and go back to the Midwest or Mideast
or wherever it is you live.
That's pretty close, it's close enough.
I mean, anybody who has actually used a machine
that's locked at 30 and used a machine
that's locked at 60 frames per second
can easily tell the difference.
Like it's, that is not a debate, really, anymore.
And to have a developer of a major game
kind of come out and say,
oh, we don't really think it's important.
What's the exact quote here?
You don't gain that much from 60 frames per second
and it doesn't look like the real thing.
It's a bit like the Hobbit movie.
That's crazy, like.
That's painful.
I mean, okay, the thing about
the whole cinematic look argument is one,
there's motion blur, okay?
And this is something that I think a lot of people
either don't understand or do understand
and hope that other people don't understand.
Because if you wanted, okay,
so motion blur is a natural effect
of the iris of a camera opening and closing,
iris or shutter or whatever it is you wanna call it.
And what happens with games is there is no iris.
There is no camera.
There is no natural motion blur.
Now, the thing is if you have enough GPU horsepower,
you can actually add fake-o motion blur
after the fact to the game
if you really want that cinematic look.
But dropping the frame rate isn't the answer
because unless you have motion blur,
you're not getting any of that effect.
And even if you do have some at a lower frame rate,
you're getting, like I said, a cheapo after the fact effect,
not proper motion blur that makes it look properly smooth.
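[Editor's note: a minimal sketch of the "after the fact" motion blur described here, assuming a simple post-process that averages the last few rendered frames the way an open camera shutter would. Real engines typically use velocity-buffer-based blur; this is illustrative only.]

```python
import numpy as np

def accumulate_motion_blur(frames):
    """Fake post-process motion blur: blend the last few frames together,
    weighting newer frames more heavily so motion trails fade out."""
    weights = np.linspace(0.4, 1.0, num=len(frames))
    weights /= weights.sum()
    blurred = np.zeros_like(frames[0], dtype=np.float64)
    for frame, w in zip(frames, weights):
        blurred += w * frame
    return blurred

# Example: a bright dot moving across a tiny 4x16 "screen" leaves a smear.
frames = []
for x in range(4):
    f = np.zeros((4, 16, 3))
    f[2, x * 4] = 1.0            # the dot jumps 4 pixels per frame
    frames.append(f)
print(accumulate_motion_blur(frames)[2, :, 0])  # a fading trail, not one dot
```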
Yeah, and you gotta think like all these TVs
and all these displays are being built
for lower persistence,
which takes away some of that inherent blurring effect
that would be native with some of these displays
and monitors, right?
Or TVs, whatever you happen to be playing it on.
So it doesn't make any sense to me.
Like the whole debate of why is it running at 900p
or 1600 by 900 on both consoles,
and then, oh yeah, it's also gonna run
at 30 frames per second because of it.
60 isn't good for a shooter
or whatever this bull crap they're talking about is.
It just doesn't make any sense.
Like everybody who's a PC gamer
who's looked at this quote or reads this quote,
they all aim for above 60.
And it's not because we're all crazy.
It's because there's actually valid reasons
to want to run at higher frame rates.
So I don't know.
People, and the thing is like,
this game is coming out on the PC.
It's like, are they gonna lock it
at 30 frames per second on the PC?
I bet they don't.
For a better experience?
I bet they don't.
I bet if Nvidia has their say
because it's a GameWorks title,
it will definitely not be locked at 30 frames per second.
Can you imagine Nvidia putting its GameWorks
and, like, "The Way It's Meant to Be Played" branding
in front of a 30 FPS launch title?
That would be worthy of controversy and discussion, right?
So, you know, it's limitations of the consoles.
That's why it's going at 30 frames per second.
So speaking of the, okay, speaking of the controversy
and how, I mean, you just sound exasperated
just talking about this topic
because we've been around and around and around again.
Why is Linus making me talk about
this stupid damn non debate
that has nothing to do with reality at all?
And the answer is because Ubisoft keeps on digging.
Why don't they just tell their devs to shut the hell up
and let this thing kind of settle a little bit?
What are they doing?
There are plenty of console games that are locked at 30, okay?
And there are reasons for it
because they don't wanna drop visual fidelity
in order to get to 60 frames per second.
And on a TV and on a console where you're kind of,
you know, you don't wanna have V-Sync issues
or you either have to run at 30 or you have to run at 60
and anything in between that is a really big problem.
So there are games that run at 30 and it's fine.
They're not great experiences,
but they also don't come out and say,
well, 30 is really what we targeted, because it's not.
Nobody targets 30.
Right.
No, it's absolutely ridiculous.
People are telling me that the audio might be out of sync
which is very, very strange.
Oh, no, I've got other people saying it's just fine.
So I think it might be a, yeah,
I think it might be a bit of an issue.
Apparently there's a very slight delay on source.
So you guys might wanna turn yourselves down
to something else if you're having a little bit of trouble.
So yeah.
If it's just me, let me know.
Yeah, no, I've got a lot of people saying it's fine.
So I'm not gonna can the stream now.
All right, so I guess I'll just see
if there's anything else here that we didn't mention.
Like, yes, I'm extremely disappointed
that the upcoming Assassin's Creed is gonna run at 900p.
Although 900p I find is less of an issue for me
than 30 FPS because frankly,
I didn't notice that much of a difference
with I think it was Battlefield 4.
No, no, I think it was the last Assassin's Creed
that launched at 900p on PS4
then got a patch almost immediately to 1080p.
Didn't really notice the difference compared to 720p
but 30 FPS really is very noticeable
and Ubisoft needs to just stop talking
immediately about this.
On that note, the Halo 2 Anniversary Edition campaign
also will not be running in 1080p.
It's gonna run at 1328 by 1080, 60 FPS.
So Ryan, given that the technological constraints here
appear to be that they have to run both games'
sound and physics engines simultaneously,
do you think this is an okay compromise,
killing some horizontal resolution for the sake of 60 FPS?
So they're taking away horizontal resolution.
I'd like to see what that looks like visually, I guess.
How much of a theater effect
are you getting on either side?
I mean, I don't know, probably.
I mean, so this is the Halo 2 Anniversary, yeah.
I don't know.
I think maybe you give the consumer an option.
So they're rendering both in the background
so you can do what they did with the first one
was where you have that one button push
to swap between them, right?
Which was a really cool effect.
But if it's accounting for that performance issue
then I'd rather just say,
hey, don't bother rendering the other physics,
other graphics, other audio in the background.
Just let me actually run at 19 by 10
because, I mean, it's kind of embarrassing
that the Xbox One won't be able to run Halo 2 at 1080p
regardless of what the reason is.
You would think they would be able to get
that full resolution out of that, so.
You would think that, wouldn't you?
Silly Ryan Shrout.
I mean, here I am looking at things
like graphics performance and gigaflops
and realizing that, hey,
maybe you made the wrong decisions on these consoles,
as it turns out.
Maybe you should've spent a little bit more money
on hardware.
And you know, I've gotta wonder
if they spent less money on hardware
because they knew it was gonna be a short life cycle
or if we're gonna end up with a short life cycle
because they realized after the fact
that they didn't spend enough money on hardware.
And then the truly baffling thing about it
is that Microsoft and Sony, at exactly the same time,
spent not enough money on hardware
in pretty much exactly the same way.
Yep.
I guess that's all there is to say, isn't it?
Yep.
Yeah, I mean, they both did it
and you know, you can't blame AMD.
AMD just built what they asked for, right?
But they could have asked for more hardware,
both companies, so.
Yeah.
I think I had something else to say about that,
but I guess that's pretty much all there is to it.
I mean, one thing that they said
was that they were running at 720p
and everything was fine, 720p 60fps.
They wanted to push it further.
So, you know, they managed to get to 1328 by 1080,
but would you rather than running an upscaled resolution
be given the option to run a native resolution
even if it's a lower one?
Yeah, I think so.
I'm curious how you upscale 1328 by 1080.
Like, I don't really understand
how you would be able to do that,
you know, algorithmically, I guess,
because you're simply just cutting off sides
of the window, right?
So if they are rendering at 1328 by 852
or something odd like that,
that was of the same aspect ratio,
then I might believe that they'd be able to upscale it well.
As it is now, I just,
I feel like they're just gonna have black bars on the side,
but surely you wouldn't do that.
No, that's not what they're doing.
They were really clear about that.
It will definitely be some kind of an upscaling.
Huh.
Whatever that ends up looking like.
Hopefully not just zooming in or something either.
I don't know.
It might be worthwhile to do a,
like actually that might be worthwhile
to check out this game, do some capture of it
and have a look at it.
I mean, it'd be convenient
if we actually got a PC version of the game
so that we could really compare it
against something meaningful, but.
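[Editor's note: the arithmetic behind the skepticism here, assuming a straightforward stretch of a 1328x1080 frame to fill a 1920x1080 panel. Because only the horizontal axis needs interpolation, the scale is anisotropic, which is what makes the result hard to picture.]

```python
# Hypothetical horizontal-only upscale: 1328x1080 -> 1920x1080.
src_w, src_h = 1328, 1080
dst_w, dst_h = 1920, 1080

print(f"horizontal scale: {dst_w / src_w:.3f}x")   # ~1.446x, interpolated
print(f"vertical scale:   {dst_h / src_h:.3f}x")   # 1.000x, rows map 1:1
print(f"aspect ratios: src {src_w / src_h:.3f} vs dst {dst_w / dst_h:.3f}")
# Each output pixel on a row blends ~1.45 source pixels horizontally while
# the vertical axis is untouched -- not letterboxing, but not a clean
# integer-style scale either, hence the "what does that even look like?"
# reaction above.
```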
Weren't there rumors that that was gonna happen?
Wasn't there? Rumors?
I don't think there's anything confirmed yet,
but Twitch chat is pretty great
about correcting me about these things.
If I ever get anything wrong.
That chat rooms are great for that, aren't they?
Yeah, they really are.
Speaking of the chat,
they've been giving me a really hard time
about launching right into the show
and not explaining who the hell you are
and not explaining where the hell Luke is.
So I'll do both of those things.
Guys, Luke has taken his first decent vacation
pretty much since he started working with me.
So he and I went over to Germany.
I was actually there and back in just over four days
worth of hours, but Luke and Brandon
both decided to stick around.
So we toured the Cherry Tour,
the guys that make mechanical keyboard switches,
and we toured the Sennheiser,
did I say toured the Cherry Tour?
Factory.
And we toured the Sennheiser factory as well.
We actually did, oh man,
we got a really great opportunity to film
the assembly process station by station
of the HD 800 headphones.
And Brandon got some amazing footage of it.
We are gonna bring that to you guys.
Everything from just stamping out diaphragms
all the way to putting it onto artificial ears
and then like an artificial head going on a track
into a noise isolated chamber where it,
yeah, no, it's freaking amazing
where they test every single unit
and then record the response curve.
Really, really cool.
We're gonna bring all that content to you guys,
but I jetted back here and Luke is sticking around
in Germany for a week and a half
and Brandon's with him for a week as well.
So I am on my own for a couple of weeks worth of WAN shows.
So that leads us to this poor substitute for Luke.
This is Ryan Shrout from PC Perspective.
Maybe you wanna introduce yourself here
cause I'm clearly not doing a great job for you.
The poor substitute is an owner-operator
at PC Perspective, pcper.com.
And basically I review PC hardware for a living
and I have for the last, God, 15 years.
Oh God, competitor, get rid of him.
Yeah, click, hang up, hang up, hang up.
So the website has been around forever.
We used to cover just AMD hardware for,
until like 2004-ish or something.
Yeah, we were AMDMB.com,
focused on like AMD processors and motherboards.
And then-
I didn't even know that.
That's before my time.
I always knew you as PC Perspective.
Yeah, 2004 we launched as PC Perspective.
Basically I graduated college and I said,
all right, if I'm gonna actually make this a job,
let's try to make this a job.
So I've been doing that since then.
And we've got a good team of people that do good work.
And I'm practically family with Linus now.
I mean, we have, we've hula hooped together
and also done the limbo together.
So it's like a bonding thing that we did.
So that's what I do.
We test hardware, graphics cards, processors, SSDs,
all that random crap.
So yeah, I mean, I guess the thing,
the way that I would,
I made that joke about us being competitors before.
We're not really.
I mean, Ryan has a podcast.
I have a podcast.
Ryan makes videos.
I make videos.
Okay, hold on a second here.
No, okay.
The main difference is that Ryan's gonna get more
into the nuts and bolts, if you actually wanna know
how the SMs within an Nvidia GPU work
and how that impacts the way that it's gonna perform
in this game versus that game,
or with this setting dialed up versus that setting dialed up.
Ryan gets really nitty gritty into the details
with those kinds of technologies.
Whereas I see ourselves as more of a general overview style
of content.
So it's just different strokes for different folks.
And the one thing that we definitely know
is that stroking is good.
So let's go.
Yeah, you pretend you don't know
what I'm talking about over there.
Like a monkey.
So speaking of getting into nuts and bolts,
this was originally posted on the Linus Tech Tips Forum
by wordo165.
Thanks wordo, you get a shout out.
Guys, make sure you're posting all the latest awesome news
in the Linus Tech Tips Forum after of course searching
to make sure no one's already posted it.
Cause you will get shout outs on the show.
And there are articles all over the place about this.
Of course, PC perspective.
Have you got, you guys have covered them.
Sorry, I haven't, I've been traveling.
I haven't looked at your site.
Yeah, so.
For which this, this MSI graphics.
Do you have one?
I think I have the one that they're talking about here.
Is this?
Oh no, sorry.
I jumped to the next topic.
980M and 970M.
Oh yeah, yeah, yeah.
I actually have one of those sitting here too.
Okay.
Awesome.
Which one's that?
Oh, you got the, oh that's the MSI.
GT72.
I don't have one of those.
I've got an Aorus X7 Pro
and then I've got the G751 from Asus.
Yep.
So the X7 Pro is one impressive piece of machinery.
Dual 970Ms in SLI.
Nice, nice.
This, I mean this is kind of the more traditional
single GPU variant of it.
It's actually, it's a big machine.
It's a 17.3 inch machine.
But it's not very heavy.
Like you can tell it's not overly bogged down by heat sinks.
Yeah, that's the one MSI's using
the magnesium alloy shell on, right?
Oh, okay.
Yeah, I think it is, yeah.
We just got it in a couple of days ago, so.
But it's been pretty impressive performance wise.
So let's talk about it.
So GTX 980M and 970M, Nvidia is touting them.
They had this wonderful graph at the editor's day
and Nvidia is touting them as the closest mobile variants
to their equivalent desktop variants ever.
And would you say from your testing
that that is a valid statement from Nvidia?
I think it is.
I think that graphic they showed
is maybe a little bit skewed just because.
Is it ever not?
Well, exactly, but it's a little bit closer together
on that graph than it actually turns out to be.
I mean, don't get me wrong,
the 980M is still an awesome part,
especially considering its power efficiency
and that's what makes it great in laptops.
But it's, I mean, there's still definitely a gap.
Like the GTX 970 desktop is faster
than the GTX 980M mobile, right?
So of course that means that there's gonna be
a 15, 20, 25% gap between the 980M and the 980 itself.
So, I mean, it's still really good,
but it's not, it's, you know,
their naming scheme still throws it off some.
That, and I think part of the issue for Nvidia
is that whenever they compare anything,
they're gonna be using a reference 970M,
which is a part that doesn't exist.
So the only 970 that I was, or sorry, 970M,
a reference 970, this is so confusing.
The only 970s that I'm aware of are aftermarket ones,
most of which are overclocked.
And the way that GPU Boost 2.0 works
is even if you were to underclock your card
to a reference clock speed,
that card might still overclock itself
to something that a reference card wouldn't have achieved.
So it makes comparing 970M,
which could also be a non-reference design,
because to my knowledge,
board partners or notebook partners
are able to kind of play around with settings there.
So comparing one thing that's non-reference
to another thing that's non-reference,
compared to Nvidia's graph,
which is reference to reference,
I mean, who even knows what's going on?
Yeah, it's interesting,
because if you look through all that information
on the mobile parts, they never give you a TDP.
I saw that.
And the reason is, it's not that they don't have a number,
it is that each notebook,
like the GT72 or the ASUS machine that you have,
like they're allowed to tweak those clock speeds
individually to make sure they're within
the thermal envelope of whatever, you know,
heat sink and fan combination they're using in the laptop.
So there's really not a reference clock speed
for the mobile parts like there is on the desktop parts,
because it is kind of a case by case basis.
So I think you'll see different clocks
throughout the notebooks.
I mean, that was the other thing I noticed
was that they only specify a base clock.
They actually don't have a boost clock in the spec.
It just says base clock is this plus quote unquote boost.
And that's all we really know about it.
Yeah, you gotta open up something like GPU-Z
or something that will give you what it is,
what the firmware of the GPU is actually set at
for its typical boost.
But again, you've been doing this long enough,
the typical boost does not really tell you
what the boost clock is going to be at anyway, right?
It just gives you some minimum that it won't go below
unless there are special cases, right?
It's all very complicated.
Yeah, unless there are special cases,
in which case it will go below.
Yeah.
And then we don't know really anything
of what's going on, unfortunately.
And then we have that thing called the base clock.
So that's actually what we mean.
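[Editor's note: GPU-Z is the tool named above; on systems with NVIDIA's driver installed, the bundled nvidia-smi utility can poll live clocks from the command line too. A sketch, assuming a driver recent enough to expose these query fields on GeForce parts:]

```python
import subprocess

# Poll the live SM (core) and memory clocks. Under load, GPU Boost will push
# the SM clock above the advertised "base clock + boost," which is the point
# being made above: the spec-sheet number is a floor, not what you observe.
out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=name,clocks.sm,clocks.mem",
     "--format=csv,noheader"],
    text=True,
)
print(out.strip())
# Run it once idle and once mid-game to see how far the firmware's boost
# behavior actually moves the clock on a given laptop's cooling solution.
```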
So, but I have to say, so we ran a bunch of benchmarks.
The article's not up yet on the website,
but the GT72 with that 980M is at the wall.
I think we're pulling like 200 to 210 watts
with the AC plugged in.
And that's not a lot of power considering it's using
a quad-core hyper-threaded part.
You've got a 980M running at probably 1.1 gigahertz
at its typical type of clock speed that it's running at.
And it's got a 1080p screen, and there's basically no game,
except maybe Crysis 3 at its top settings,
that can really make it work hard to render at 1080p.
So it's actually pretty impressive
and the fans don't get super loud
and it's a pretty good mobile gaming experience
I think so far.
Now, speaking of the mobile gaming experiences,
experience, is this a bit of a strange trend to you?
I've noticed that, all of a sudden,
we were seeing 3K and 4K notebook displays
and we were getting 860Ms or in some cases 870Ms
in the case of stuff like the Aorus X3 or the Razer Blade.
And then all of a sudden we get these super powerful GPUs
in the 970M and the 980M that could really drive
a 3K display properly.
And it seems like everyone all at the same time
met in the back room somewhere and decided to bail
and go back to 1080p.
What the heck happened?
I think the reality is neither the 980M or the 970M
could really push a 3K display.
Okay, not in the Crysis 3, but Shadow of Mordor,
high details, maybe not max?
Maybe, maybe you'd be able to.
And I think that's what, from Nvidia's point of view,
they want to be able to say max out everything
on this laptop, right?
And so 1080p is the right resolution for that.
And again, you're talking about a 17 inch screen
that is a little bit further away from you maybe.
And so 1080p just kind of makes sense.
Now you can hook it up to an external display, which we did.
This has DisplayPort connections on it.
We hooked up that Asus Swift G-Sync monitor and it works.
So if you want to use an external display
for higher resolutions, you can.
But I just, I don't know.
I didn't see the benefit of those
ultra high resolution notebooks really,
because in Windows, it's not very useful
because you've got to turn up the scaling
to a certain amount so that you pick,
so that your icons and your text is actually readable.
And then in games, you know, there's,
when your native resolution is 25 by 14 or 32 by 16
or whatever it was.
Whatever that one works out to.
Yeah, and you downscale to 1080p.
It's not going to look as sharp as if it were running
on a native 1080p screen.
So I'm sure we'll see those.
And something like a machine that has two 980ms in it
would be kind of the perfect candidate for that.
Yeah, a two and a half K display with two 980ms
is probably an excellent sweet spot.
I mean, I guess the thing that, it just baffles me
that we get these underpowered notebooks
with these high resolution displays.
And then we get these overpowered notebooks
with probably the right display,
even though really that's not Nvidia's messaging
about it at all.
I mean, their slide deck is that 900 series M
is suitable for 1440p.
But I think that the actual designs coming out
reflect the true reality of what's going on.
And I mean, you've played around
with dynamic super resolution, correct?
So do you think that dynamic super resolution
is a good balance then,
if you're going to have a 1080p display
and you just take more samples on a notebook?
Yeah, we ran the Dark Souls 2 demonstration portion
or whatever on this MSI laptop using DSR
and it looks great.
Skyrim with DSR looks great.
So if there are games where you have that capability
to run at higher resolution,
you can still do that and take advantage of it
on the 1080p display.
It's kind of, it is kind of a really nice,
almost perfect setup for that, right?
So if you've got more horsepower,
down sample, render at 4K, down sample.
If you don't, then you've got a native 1080p display for it
and it works out pretty well.
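[Editor's note: a toy version of what DSR does, assuming a plain box filter; NVIDIA describes DSR as using a Gaussian filter, so this only shows the render-big, average-down idea.]

```python
import numpy as np

def dsr_downsample(hi_res, factor=2):
    """Average factor x factor pixel blocks to shrink a supersampled frame
    to the panel's native resolution (e.g. render 4K, display 1080p)."""
    h, w, c = hi_res.shape
    blocks = hi_res.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)    # stand-in for a 4K render target
frame_1080p = dsr_downsample(frame_4k, factor=2)
print(frame_1080p.shape)                     # (1080, 1920, 3)
```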
So I guess that's pretty much all there is to say
about them.
They're power efficient.
They support all the same features as the desktop ones.
I think the only really disappointing thing about it for me
is that I feel like a lot of the talk about,
oh, you know, the desktop and mobile
are getting much closer, is spin.
Because I really feel like the desktop
could have been further ahead if Nvidia hadn't started,
back with the GTX 680,
started kind of releasing their mid-tier chip
as a flagship and then their true flagship
as the next generation flagship
and now extended this pattern
where we're not getting full-on Maxwell,
but it's being sold to us as a top-tier chip.
So Nvidia is effectively getting two gens
out of each architecture in terms of the numbering scheme.
When we used to get a full new series of cards
with a full new architecture,
or at the very least a die shrink,
each time they said that they were releasing
something new to us.
Yeah, I mean, that's definitely the case, but it's-
I don't think they have a choice.
Yeah, it's a result of the manufacturing process issues
that are there, right?
It's, well, we're stuck on 28 nanometer, what can we do?
And also keep in mind that Maxwell
is aimed at power efficiency, right?
Like that design was really aimed at
getting the most performance per watt.
And when you can do that, by definition, right,
because of the way physics works, you know,
the gap between your GPU on your notebook
and on your desktop is going to compress.
They're going to get a little bit closer, right?
Because that sweet spot for performance
for the 980 is 165 watts,
where for the 980M it's like 120 watts
or whatever it actually is.
So, whereas on the GTX 680,
it was 210 watts versus 110 watts.
So it's getting closer,
but I think it's not nearly as close
as they would like you to believe
based on their fancy new marketing graphs.
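[Editor's note: the wattage figures quoted in this exchange are the hosts' from-memory numbers, not spec sheets, but they make the "getting closer" claim easy to check:]

```python
# Desktop-to-mobile power budget ratio, per the numbers quoted above.
kepler_gap = 210 / 110    # GTX 680 vs. GTX 680M
maxwell_gap = 165 / 120   # GTX 980 vs. GTX 980M
print(f"Kepler gap:  {kepler_gap:.2f}x")    # ~1.91x
print(f"Maxwell gap: {maxwell_gap:.2f}x")   # ~1.38x
# The power budgets converged, so mobile can sit closer to desktop than
# before -- but a ~1.4x power gap still leaves a real performance gap.
```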
I mean, I did find some games
where it was only a 10, 15% delta,
but these were situations like Tomb Raider
where I really wasn't that limited by the GPU itself.
I was more limited by the CPU
because they were both running the game incredibly well,
even at ultra details.
Whereas in games like Crysis 3,
I was seeing as much as a 40% delta
between the 970 and the 970M.
Yeah, there's going to be those.
And I think, you know, if you look at just the specs,
the 980M has fewer shaders
than the 970 desktop part does, right?
And the memory clock runs significantly lower,
five gigahertz instead of seven gigahertz.
So there's, fundamentally,
there is going to be a difference there.
So I think the closest analog to the 980M
is probably the 970 desktop.
And it's kind of between the 680,
well, we'll say the 770 and the 970
is kind of where performance sits there.
All right, so let's talk about the new Unreal Engine
bringing eerily realistic skin to your games.
Does eerily realistic skin even sound like something we want?
No, not me personally.
Ethnod posts this on the forum, thanks for that.
And the original article is either from Engadget
or unrealengine.com if you prefer to get it straight
from the horse's mouth.
And basically, Ryan, do you want to explain
what subsurface scattering is exactly
and how that makes things like skin or candles,
or especially any objects that are slightly translucent,
look more realistic in games?
So, I mean, subsurface scattering is not,
I don't think it's a graphics rendering specific term,
but the idea is pretty simple,
where you have a semi-translucent layer.
In this case, your skin, right?
And the light will penetrate one or more layers of the skin
and then bounce out in a different way, right?
So you may have some portion of that light
hitting your skin and bouncing back.
You may have some portion of the light
going a couple of layers deep and bouncing out.
And the result is a different kind of shade or look or style
to how skin actually looks.
And it's something that we're used to seeing every day
when we're sitting here recording a video
and you just look at yourself on the camera
and you see a little bit of sheen
on that part of your forehead there.
That is something that is very hard to render accurately
and do it in a fast, real-time method.
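[Editor's note: one classic cheap approximation of subsurface scattering is "wrap lighting," which lets diffuse light bleed past the shadow terminator instead of cutting off sharply, mimicking photons that enter the skin and exit a little further around the surface. A sketch only; UE 4.5's actual screen-space subsurface scattering is far more involved.]

```python
import numpy as np

def wrap_diffuse(n_dot_l, wrap=0.4):
    """Wrap-lighting diffuse term: plain Lambert clamps N.L at zero, giving
    skin a hard, plasticky terminator; wrapping softens that falloff."""
    return np.clip((n_dot_l + wrap) / (1.0 + wrap), 0.0, 1.0)

# Surface normals rotating from facing the light to facing away from it:
n_dot_l = np.cos(np.linspace(0.0, np.pi, 7))
print(np.round(np.clip(n_dot_l, 0.0, 1.0), 2))  # Lambert: abrupt falloff to 0
print(np.round(wrap_diffuse(n_dot_l), 2))       # wrapped: soft, skin-like
```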
So Unreal Engine, what is it, the 4.5 update?
4.5, yeah.
Yeah, I mean, these guys,
they just do everything awesome, right?
Like the tech they build is impressive as hell.
So a subsurface scattering, basically,
I'm trying to think the first time I remember hearing that,
like some of the,
what was the Nvidia elf pixie character's name?
Oh, shoot, it wasn't Dawn, was it?
Yeah, I think it was Dawn.
Yeah, I think you're right.
And that was like, they were like showing off
subsurface scattering for the very first time.
But now you think about it,
how long has it been since we saw that demo
and now we're finally getting it into an engine
that can operate that technology in a real-time manner?
It's been six or seven years, hasn't it?
Yeah, I mean, there are several iterations of Dawn,
so I can't tell you exactly which one
had that demo to begin with, but yeah.
I mean, the funny thing about it is
it puts us sort of tech podcasters or journalists
or whoever else, it puts us in a really weird position
because Nvidia and AMD for that matter
are both always trumpeting about this stuff,
whether it's VXGI or whether it's physics simulations
interfacing with each other in real time
or whatever the newest, coolest thing they're showing us is
where we kind of sit there and we go, well, shit guys,
it's gonna be eight years before you can actually do
any of this in real time in a game engine
because you look at something like that lunar,
okay, here, before I talk about the lunar lander,
guys in the corner here,
on the left you've got no subsurface scattering,
so his face looks kind of too harsh.
In the middle you've got
a little bit of subsurface scattering,
so the reflections off of his face
and the tone of his skin looks a lot more natural.
And then on the very far right,
you've got an exaggerated effect
where they've cranked it up too much,
kind of like oversaturating your TV
and it just, his face looks like it's made of melted wax.
So I just wanted to show that to you guys.
So a great example of this would be Nvidia's
lunar lander demo where they've got their VXGI
real-time global illumination running
on two GTX 980s, with a static scene that is not moving,
and with, what is it, two models of people where we're,
and we don't even have anything that's complicated
to render, they're in suits.
They're not even, they don't even have skin or anything.
And then we've got a ship that's mostly hard edges
and that thing's still chugged,
just moving the camera around a little bit.
I mean, how far are we away from A,
having any hardware that can run this reasonably well
and B, a game dev actually implementing this technology?
It makes it tough to talk about this stuff
when it's actually like futuristic tech,
not really that meaningful for the GTX 980.
No game will ever run on the GTX 980
that uses that technology.
You can quote me on that.
It's been that way really since I have ever covered
graphics cards, right?
From the very first time we saw T&L lighting
and then programmable shaders and geometry shaders
and all this stuff, they always had these amazing demos.
They never really knew what the implementation timeline was.
And, you know, part of that comes back to people
that have this distaste for something like GameWorks.
Nvidia's goal with that is to get that tech into games
as quickly as possible.
Now it means they sacrifice compatibility
and kind of broad industry support for proprietary stuff.
But, you know, like they say that VXGI
is in the current iteration of the Unreal Engine,
but when will you actually see it in a game
is still up in the air.
And then to what degree, right?
There's a whole lot of resolution options for VXGI.
And I mean, the thing about that is to what degree
is really gonna depend on the adoption of cards
that support it.
And if we've got two cards,
each of which cost more than $300,
then we're a long way away from any game dev
actually investing the time and resources
that it would take.
I mean, you look at game devs,
not even porting their existing games to DirectX 12
or not even taking games that are currently in development
and bothering to move them to DirectX 12 or Mantle
or whatever the case may be
where you're gonna have an enormous market,
a built-in user base for these technologies.
And then you take something like VXGI
where it's limited to one or two cards.
I mean, come on.
I mean, having it in a game engine like UE4
will help with that.
And it needs to be one of those things
that you can enable, disable.
It doesn't affect your gameplay per se,
but more about visual style.
We wanna see that kind of stuff in there.
And so I applaud any company,
whether it be a hardware vendor or software vendor
that is willing to stick their neck out
and do more work to push the technology forward, right?
If we only stuck with things that worked on GPUs today,
we would never actually develop the tech
that would allow us to do even newer, better, cooler things
on GPUs next year, right?
So somebody's gotta start at hardware or software.
It used to be the software that pushed it, right?
And now it's kind of the hardware developers saying,
hey guys, try this.
Hey guys, try this.
Speaking of hardware developers,
this topic isn't actually in the document,
but AMD reached out to me.
I've got a page-long email from Robert over there in my inbox
saying that they watched the October 4th WAN show.
Yay, people watch my show.
And wanted to address some points I made about FreeSync.
Okay, you acknowledge at 25:51
that G-Sync monitors are extremely expensive.
We are hoping that FreeSync will help with this.
You say there is misinformation concerning FreeSync,
but AMD does offer a comprehensive FAQ.
So they had just asked me to go ahead and show you guys
the AMD FAQ that exists about Project FreeSync.
So it's support.amd.com slash en dash us
slash kb dash article slash pages
slash FreeSync dash FAQ dot ASPX.
Really easy to remember, guys.
Just go ahead, key that into your browser right now.
But the main thing that I wanted to clarify
is that I wasn't saying that the misinformation
that's there right now is coming from AMD.
AMD has their FAQ and that's fine and that's great.
What I was saying is that
there's a lot of misinformation out there.
And I think that it's a lot of people repeating
things that they heard earlier on
in the FreeSync development process,
back when we really didn't know what the heck was going on
and AMD wasn't communicating as clearly
as they are right now.
So Ryan, you're pretty familiar with FreeSync, correct?
As familiar as I can be, yeah.
Yeah, well, given that we don't have products
in our hands yet.
But I mean, can you explain what the heck it is exactly
and how it relates to G-Sync?
So the initial, so the way I understand it now,
FreeSync is a name, I don't know if it'll be the final name,
a brand associated with the implementation,
with AMD's specific implementation
of the Visa DisplayPort 1.2,
a adaptive sync technology.
So adaptive sync was adopted by the Visa Foundation
into DisplayPort 1.2a, revision two
or whatever they call it,
as an optional part of the specification.
And all it does is it gives the hardware
and software vendors the ability to understand
that there is a potential for the ability
to withhold a V-blank signal, right?
So you basically can control a monitor's refresh cycle
through a system, through an external system like a PC.
Now, the adaptive sync standard doesn't give you
any information about how you handle special cases,
how you actually implement it,
how it works with your driver,
how it works with Windows and all that other stuff.
So the FreeSync technology, as they're calling it,
is kind of AMD's implementation
of adaptive sync into a product.
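[Editor's note: a toy model of the V-blank withholding just described. The timings and clamping behavior are illustrative assumptions, not anything from the VESA spec; real panels also have to self-refresh below their minimum rate rather than simply clamping.]

```python
def present_times(frame_times_ms, min_interval=6.94, max_interval=33.3):
    """Adaptive sync in miniature: the source withholds V-blank until the
    next frame is ready, so the panel refreshes when the frame arrives,
    clamped between its fastest (here 144 Hz) and slowest (30 Hz) intervals."""
    t, schedule = 0.0, []
    for ft in frame_times_ms:
        t += min(max(ft, min_interval), max_interval)
        schedule.append(round(t, 1))
    return schedule

# Irregular frame times get shown when ready, not quantized to 16.7 ms steps:
print(present_times([12.0, 19.5, 7.1, 25.0]))   # [12.0, 31.5, 38.6, 63.6]
```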
So that's one of the biggest misconceptions
out there, you guys.
FreeSync is not built into DisplayPort.
It's not just a matter of any DisplayPort 1.2a display
and any AMD GPU that supports FreeSync,
you plug them together and magic's gonna happen.
In much the same way that NVIDIA's G-Sync
relies on the hardware being implemented
on the monitor in a specific way,
we will have that same process going on with FreeSync.
It will have to actually be a FreeSync-certified display.
Now, AMD's saying that they're not planning
to charge a licensing fee for it, though, correct?
Well, correct, because the DisplayPort,
so the only technology required
by the display scaler vendor is adaptive sync, right?
So that is already part of the standard,
so it's not something AMD really controls.
So a monitor will come out that will support adaptive sync.
It doesn't necessarily support FreeSync.
It will be on FreeSync, the AMD technology,
to support adaptive sync monitors, right?
But as it turns out, because Tom was on our show
and talked very openly about the fact
that NVIDIA was not going to support adaptive sync monitors,
you will essentially have FreeSync versus G-Sync
as opposed to what many had hoped,
which was you would have G-Sync
and then this other standard
that both vendors would support.
Which is-
And would eventually just be ubiquitous.
Right, which is not going to be the case anymore.
It's disappointing, I know,
and I expressed that to him several times.
But FreeSync has the capability,
it has the opportunity to be everything
that we love about G-Sync and cheaper.
It has that opportunity.
What they need to do is prove it, right?
NVIDIA's claim has always been that,
hey, it's not easy to do.
If it were super easy to do,
everybody would have done it by now.
And AMD's stance is that we can do it,
it's pretty easy, and we can make it,
we're not going to charge the licensing fee
and we're not going to charge the markup
that you see on all the G-Sync monitors out there,
which is an admirable goal.
I'm totally for that.
Give us variable refresh displays
cheaper than we get them today.
Yeah, I'm all for it.
I have to wonder if they're going to pull it off,
because they did say, and they linked me
to AnandTech, that they showed
a working FreeSync monitor at Computex.
So that's cool.
But Scott had, we had Scott Wasson on last week.
So Scott had expressed some, I guess,
some respect for how complicated G-Sync was.
And he had said that Tom had told him
that they actually, it was a good thing
that they used a programmable chip for the G-Sync module,
because otherwise they would have been doing
a hardware refresh.
That's why it took so long,
from when they showed it to us back in October last year,
to when we actually got monitors,
because it wasn't working.
They had to actually redesign the functionality
of the chip and reprogram it.
Yeah, yeah.
I mean, yes, the same thing is what I've been told as well,
that it's incredibly complicated.
I mean, don't, I mean, clearly you and I
were both at that event in Montreal.
NVIDIA was incredibly excited about the tech.
They would not have waited until August
to release a display if there was any way
they could have not waited until August
to release a display, right?
Yeah, and I mean, NVIDIA's usually pretty tidy
as far as announce, seed samples, get it out the door.
Like, their execution is textbook, you know,
them and guys like Apple, they do it, they nail it.
And for NVIDIA to announce something and show it
and promise it was gonna be available on a date,
you're usually pretty sure that they're pretty damn sure
it's coming and they were way late on it.
Yeah, and I think it's, I think,
I don't know this for a fact, but I think one of the reasons
why NVIDIA will not support adaptive sync displays
is because to support an adaptive sync display,
your driver basically has to do all of the work in Windows
that the combination of driver and controller
that NVIDIA has on G-Sync will do now.
And if they do that, basically they will have to put
all of the knowledge that they have learned
over the last couple of years building G-Sync
into their software, which will then make it
easily discoverable by AMD, right?
And so that they would get a jumpstart in that way.
But, you know, like I said, I want FreeSync
to be here already.
You know, we've had Richard Huddy and those guys
on our show and they've promised prototypes
at certain times and don't really have them yet.
I was promised one in September and as far as I know,
it's the middle of October now.
Well, let's be fair, it's the 10th of October,
which is about a third of the way through.
It's the first third of October, we'll go with that.
Yeah, October has 31 days.
So until we're, actually no,
if we're a third of the way through today,
then no, it is the second third of October, but not half.
Let's just agree that it's not half.
Okay, we're gonna make these claims out here.
We wanna make sure we're accurate.
So, you know, I want it to be there.
And I think what will inevitably happen
is we'll have to wait until CES.
I think scaler vendors, it's not a super quick process.
You know, when we first heard about FreeSync,
they were talking about,
oh, you'll be able to upgrade some displays
and that's clearly not going to happen, right?
As it turns out, flashing a firmware
is not going to make a monitor a good experience.
You can make it variable refresh all you want,
but it won't be a good experience.
Right.
And when light is involved in hitting your eyes,
any degradation in that will be noticeable easily.
Which worries me a lot.
I mean, here's something.
Cause okay, it's, well, it's not that easy.
Okay, so let's go back to the old days
when it was pretty easy to figure out
which graphics card delivered the best gaming experience.
You know, you fired up a game,
you recorded your average FPS and you made a bar graph.
And that was the whole story.
Then all of a sudden we're dealing with runt frames,
frames that are displayed for such a short period of time
on your screen that you actually don't even perceive them.
We're dealing with things like stutters
that don't actually show up in an average
or even necessarily a minimum frame rate,
but that are visually very obvious.
We actually illustrated this
in our four way SLI scaling video very recently
where we intentionally froze the frame of our video
very periodically lowering our overall, oh, you saw that?
Yeah.
Oh, okay.
Yeah, so we were, so for the viewers then
we were trying to make a point
that just because you lower your frame rate 1%
or two or 3% doesn't mean that it's only 3%
less visually smooth.
You can see a stutter very clearly.
So that's why, and this was the first time we ever had you
as a guest on the show.
That's why FCAT or actually capturing the output
of the graphics card, analyzing every frame to look
at what the viewer was actually seeing
became very important.
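[Editor's note: a quick demonstration of why average FPS hides exactly the problem being described, using made-up frame times:]

```python
import numpy as np

# 120 smooth frames at ~16.7 ms plus one 150 ms hitch: the average barely
# moves, but the hitch is a glaring on-screen stutter.
frame_times = np.full(121, 16.7)
frame_times[60] = 150.0                       # one frozen frame mid-run

print(f"average FPS: {1000.0 / frame_times.mean():.1f}")   # ~56 -- looks fine
print(f"worst frame time: {frame_times.max():.1f} ms")     # the stutter
# This is why frame-by-frame capture analysis (FCAT) mattered: you have to
# look at the distribution of frame times, not a single averaged number.
```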
But for something like G-Sync versus FreeSync
how the hell are we gonna qualify, or excuse me,
quantify the smoothness of the experience?
So it will involve basically the same process
we use now for capture, but with cameras, right?
And actually we kind of have the ability right now
to capture variable-refresh video through DisplayPort.
It's like working through some of these ASIC manufacturers
and it's not working very well.
And it's the post-processing side of it
is actually much more difficult.
Post-processing with a static 60 Hertz video
is really easy.
You know what to count for and what to look for.
When it varies all the time, you really don't know
what you're looking for to even measure against it, right?
So it's gonna take a little bit, I think software
on the tested PC side to output some
"here's what I tried to send" results.
And then a post-processing algorithm that says,
well, here's what I actually saw.
And then we get into the issue of,
is Nvidia gonna play well with that?
Is AMD gonna play well with that?
Are they going to fix that system in some way?
It's gonna be really complicated.
And I think at least initially it's gonna be a lot of,
well, I have these two monitors sitting next to me
and they're identical systems.
You know, one's got a 290X and one's got a 980
and I'm playing the same game.
And maybe I have a mouse going into an HID distributor
so that you can play the same game with one mouse
and try to like see the comparisons.
There's gonna be a lot of that crap
where it's very objective.
And because it's objective, everybody's wrong, right?
If I say I like G-Sync better.
Yeah, yeah, sorry.
Because it's subjective, everybody's gonna be wrong.
Because everybody will have a different opinion.
So basically the hardware review industry
of which you've been a part for 15 years
is going full circle from like,
yeah, dude, I got this graphics card.
It gets like great FPS and my games run super smooth
to getting it really down to a science
to where we could really figure out which one was better.
And then we're going all the way back to,
yeah, dude, I got these two graphics cards.
I'm gonna play with them both side by side.
I'll let you know which one's better.
Yeah, I think it will be like that initially
and that sucks, but we'll fix it.
We'll figure it out.
Like it will be an industry problem that will be solved
because there's too much money and too much pride
in both of these companies to kind of just let it sit there
and be determined by our subjective eyeballs.
By staring at it really hard.
Yeah, and nothing good happens.
Once you've stared at a monitor for three hours straight,
trying to figure out which one is better,
nothing good will happen.
So it needs to be something.
Yeah, I had someone tweet at me the other day
about how easy my job is.
And I was kind of, I was sitting there last night
at four in the morning,
benchmarking these new mobile graphics cards,
just kind of going, dude, you have no idea.
My wife to this day still believes that I sit at the office
and play video games all day.
And I'm like, ah.
No, I'm benchmarking them, hun.
When you play the same 60 to 90 second portion of Skyrim
for the, I think 40,000th time,
it's not really fun anymore.
You know, some benchmarks are okay.
I don't mind our Tomb Raider run,
cause it's got some cinematic bits
where you kind of watch some stuff at the beginning,
then it's got like a slow-mo thing
where you like pop two guys,
then you go into a burning building,
you jump across a chasm,
you like silent kill some guy,
you jump up and then you kill two more guys.
Like it's actually a pretty engaging benchmark,
but our Shadow of Mordor benchmark,
when I was running that last night,
Tomb Raider, the two minutes flies by.
Shadow of Mordor,
the only way we got it consistent enough,
cause we'll do five or 10 runs once we've locked it down
and we'll be looking for about 0.2 to 0.5 FPS difference
on the same hardware in order to decide,
okay, this runs okay for our error tolerance.
So Shadow of Mordor, the way we got that done,
because it's got dynamic weather,
it's got dynamic, you know, roving bands.
Personnel locations, yeah. Yep, bad dudes.
And all this stuff, it was really hard.
So we ended up finding an instance
that always has the same weather
and then always has the same baddies
that will mostly come and attack you all at the same time
in pretty much the same way and they'll circle you.
So the way the benchmark works
in order to make it equally visually demanding
is we attract all the baddies
and then block for two minutes
with some nice scenery in the background.
So we keep our character stationary
and our camera as stationary as possible
and block for two minutes
and that's how we got the consistency.
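[Editor's note: the acceptance test being described, in miniature. The 5-10 runs and 0.2-0.5 FPS tolerance are the numbers quoted above:]

```python
def runs_consistent(avg_fps_per_run, tolerance=0.5):
    """Repeat the benchmark pass several times and only trust the scene if
    the spread of average FPS across runs stays inside the tolerance."""
    spread = max(avg_fps_per_run) - min(avg_fps_per_run)
    return spread <= tolerance, spread

ok, spread = runs_consistent([61.2, 61.4, 61.1, 61.3, 61.2])
print(ok, f"spread = {spread:.1f} FPS")   # True -- repeatable enough to use

ok, spread = runs_consistent([61.2, 59.8, 62.0, 60.5, 61.1])
print(ok, f"spread = {spread:.1f} FPS")   # False -- scene too dynamic
```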
I swear, sometimes I would miss a block
because I was half asleep, half asleep at the wheel there.
Yeah, that's, I could see how that would be pretty bad.
Yeah.
You know, at three in the morning, it's like,
oh, I have to block this orc again, crap.
Just get like a, remember when they had the controllers,
the turbo buttons where you can just hold it down,
maybe you can do that.
No, because I have to reposition him.
I actually have to be paying close attention to the block
so I keep him in the same location
so I'm rendering basically the same scene.
Yeah, all right, that's pretty bad.
Yeah, and the funny, the stupid thing
about Shadow of Mordor is,
and I'm sure you're with me on this,
how much do you wish that video game makers
would build in benchmarks into their games?
Shadow of Mordor does have a benchmark in it.
It does, but here's a problem.
Shadow of Mordor is capped at 100 FPS
and the benchmark runs at whatever frame rate it wants.
So you could get a max FPS value of 280 frames per second
because I think it's actually recording
when the screen is black
and the game will never run that way.
So you're not getting a meaningful result.
Yep, yep, you're right.
I didn't see that issue
because we were using the capture stuff
so we didn't capture the black screen stuff.
So yeah, okay, I see what you're saying.
Yep, good point.
That and from what I've seen,
the Shadow of Mordor benchmark
is really not that representative of in-game performance
anyway because you don't get that close to any of the models
and you don't really have a ton of them on screen at a time.
So when we do our blocking combat thing,
we actually get 10 to 20% lower than that fly through
which has fires and all these things
that should be demanding.
And then when you have adverse weather conditions,
you actually drop another 20% off the performance.
So that in-game benchmark, which is always great weather
and never gets too close to anything,
it's about 30 to 40% off of the performance
that people could actually expect to get in the game.
I think it does rain in the benchmark
but it's not the most demanding
instance of rain in the game.
When the weather's really heavy, it tanks it.
So that was really frustrating for me
because I kind of went,
well, this isn't that usable actually.
Yep, yeah, I think what would be better is,
well, I mean, in your instance there, it's all dynamic.
So being able to record and replay USB input
wouldn't really help you in that instance.
No, no, it wouldn't, unfortunately.
Tough life we have, sorry.
Yeah, I know, playing video games for a living.
Or so everyone thinks.
That's what they think, it's all right.
So here's an interesting little piece of news.
Now we all know the iPhone 6,
which by the way, I have finally received one of.
For those of you who were expecting me to do a review,
here's my iPhone 6 that I have obviously totally set up.
See, it's got all the stock icons
and nothing else on it.
So I haven't started yet.
Finally got my iPhone 6.
I live dangerously, got it in my back pocket.
Ooh, I'm like a badass over here.
So we originally thought there was a viral video
that Marques Brownlee released
where he was showing off the sapphire glass
that was supposedly gonna be on the new iPhone 6
and iPhone 6 Plus.
We originally thought we were gonna get a sapphire glass,
complete display cover.
Turns out we didn't get that at all.
And GT Advanced Technologies, Apple's sapphire glass supplier
has just filed for bankruptcy.
So this was posted by QwertyWarrior on the forum.
The original article was from Next Power Up.
And basically last year they entered into a big deal,
over $500 million with Apple to supply sapphire glass.
They're meant to start actually supplying it in 2015.
And they are now down to 85 million in cash
and looking for additional financing
to continue their operations.
So Ryan, how much of a blow is this
to the adoption of sapphire glass?
Because we all know that Apple really leads the charge
on materials technology in a way
that no one else seems to be willing to do.
The issue with Apple is they like to own the materials
that they use, or at least kind of be
the almost exclusive buyer from those companies, right?
So I'm interested, so like the phone doesn't use sapphire,
but the watch uses sapphire, right?
So is this indication that they don't expect
the watch to sell as much as maybe they had originally?
Or you believe that they were gonna build
the display out of sapphire and change their mind
kind of at the last minute, I guess, on the phones?
I do wonder.
Yeah, I mean, that seems likely, right?
You have a company who has this multi-million dollar deal
with multiple years through Apple,
and it kind of like all falls through.
Clearly they were doing some experimenting
and it didn't turn out, but.
Yeah, and I mean, someone the size of Corning,
I think even an Apple PO
probably doesn't scare them that much.
So if Apple turns around and goes,
okay, yeah, we need one bazillion orders
of Gorilla Glass 3, I think Corning
can probably turn that around for them okay.
So that really is what it looks like.
Those samples that were floating around
may have very well been samples
that no deal ever got done on.
And how disappointing is it then to see
that sapphire glass is still probably
not gonna be the standard for a while?
You know, I go through a lot of phones.
I don't really scratch them very often
and I don't put protectors on them usually.
I do sometimes, but I was never really sure
that it made sense based on the cost difference
to have a sapphire screen on a phone.
For my watch, it makes total sense
because I slam that into every wall and doorframe
and yeah, and everything, right?
Like it gets beat up.
I have a Samsung Gear Live
and I don't think it has a sapphire screen
and I think you can tell that that is the case, right?
So.
My pebble steel is still doing okay.
Does that one have a Gorilla Glass screen at least?
I don't know actually.
It's actually, I will admit looking at it now
in these lights that there aren't really
any major scratches, which is maybe impressive.
Maybe it does have something in there good.
But I mean, even if it has like Gorilla Glass 3,
that's still a pretty good product for scratch resistance.
So it just, you know, it makes more sense.
It's hard to manufacture, sapphire is.
It's easier to, much, much easier to manufacture
in a small form factor like this
than it is in a 5.5 inch, you know,
iPhone 6 Plus type form factor.
So I think we'll get there eventually, right?
Like they'll, somebody will have some manufacturing
breakthrough and costs will go down.
And it's just like anything else
that you produce or manufacture.
Somebody will eventually figure it out
if there is a need for it.
I guess it just wasn't GT Advanced Technologies this time.
It was not, or maybe it was them.
And they have the best solution possible.
And Apple was like, no, we don't want to cut into our,
you know, our markup, our margins that much.
So we're going to wait for our next generation.
So maybe the 6S will have sapphire, or maybe not.
I know, so.
Who knows with Apple.
Yeah, that's true.
Speaking of the Gear Live,
how do you find the smartwatch experience
without a multi-day battery?
Because I've stuck by the Pebble Steel,
especially at that new $200 price point
and with that new third-party app
that's giving you continuous fitness monitoring,
though possibly not to the same extent
that some other stuff can do with heart rate and all that.
But I freaking love this thing.
And I love that even as a very heavy user,
I can go four days without charging it.
Are you okay with one day?
I would like to not be, but I think I am.
You know, I guess my general rule is
I'm going to charge my phone anyway.
So plugging in the watch right next to it
doesn't really make it that big of a deal.
There have been several instances
where I've like taken off my watch on my dresser
instead of my nightstand where my chargers are
and forgot about it and woke up the next morning
and had a dead watch and got, well, that's stupid
and wished that it had had longer battery life.
But I wish that for my phones every day as well.
So does better wireless charging address all of this
in the next six to 18 months anyway?
Depends on what it is.
Like, so wireless charging would help, right?
Because the main problem with the Galaxy Gear Live at least
is that it doesn't have a standard USB port for charging.
Yeah, it has that stupid cradle, right?
It's got the cradle and it has the cradle
because of the water resistance that's necessary, right?
You know, you don't want to ruin your watch
when you wash your hands.
So, you know, and not only having that one charger
means I don't have one at the office
and I don't have one in my car
and I don't have one at the house all at the same time.
Scrub.
Yeah, I don't know if you can buy them extra,
but I'm sure you can.
But I forgot the question now.
Is a day of battery life okay?
It's okay.
Like it sucks, but I would like my phone
to go longer as well.
But we all made the sacrifice.
When I had a Blackberry,
I could go- Wasn't it awesome?
I could go three days on a phone.
That would be great.
My old Nokia brick?
Yeah, I decided to sacrifice features
and functionality for battery life.
And now, as people in this office
would be able to attest to,
I am desperately looking to go the other direction
without sacrificing features.
Like I have a GS4 now, the battery's getting old.
Its battery life is starting to drop.
I would love to have a phone
with the same level of performance and feature set
that has like two days of battery life.
Don't give me a phone that has new stuff.
I don't necessarily want the new stuff.
I want a phone that's gonna last me.
And if I get drunk and pass out at a friend's house,
I don't wake up with a dead phone all the time.
Not that I do that all the time, just in case.
Question for you.
And you know what?
We've gotta do a straw poll on this.
Let me just get a straw poll going here.
Do you think that anyone will have the balls
to do a two-day or a three-day battery phone
and sacrifice the thinness and sexiness
that we've come to expect?
I was really hoping Apple would do it,
because Apple's been such a pioneer
when it comes to battery life.
I was really hoping that the 6 series
would maintain the same thickness as the older phones,
compromise on weight a little bit,
add a little bit of weight,
and put a similar size battery
to what a flagship Android phone is doing.
So maybe where the 6 Plus would have had
a 3,200 milliamp-hour battery,
like something like an Xperia Z2 does,
where Apple, with the way that they sip battery,
especially when idle or with tasks running in the background
compared to Android,
would have legitimately been able to deliver a multi-day,
at least a two-day battery.
Do you think anyone's gonna have the balls,
or are we gonna have to wait for Project Ara
and for people to just build it themselves
out of modular components?
I want to believe that somebody
would have the guts to do that,
because it's not really that,
I mean, as many Android phones as there are out there,
as many options as HTC makes or Samsung makes,
like, take the Galaxy S5 and call it, you know,
the Galaxy S5 Business Edition or something like that, right?
And you make it a half-inch instead of whatever else, right?
You make it a quarter-inch thicker
and you give it like a 4,500 milliamp-hour battery
or something like that, right?
I would buy that phone,
because I have normal-sized pockets.
I'm not, you know,
I don't have an issue with the size of phones
and I don't carry a purse or anything, but I'm okay.
I would much rather sacrifice that.
You look at the super thin iPhone 6 and 6 Plus,
and they're beautiful-looking designs.
They really are beautiful.
But add two millimeters to it,
three millimeters to it, four millimeters to it,
how much do you extend that battery life?
I'd love to see some kind of math on that.
And I'm sure you could do it without too much work, right?
If you add four mil of battery to that device,
what does that equate to in milliamp-hours?
And then what does that equate to
in actual real-world usage?
And the thing is because-
How many people have Mophies, right?
People buy Mophies all the time
and it makes your phone a brick.
Yeah, exactly. They do it for battery life purposes.
Exactly.
And I mean, the thing is that these companies,
a lot of the time I feel like guys like Samsung in particular
they just kind of throw things at the wall
and hope that they stick.
And they don't really even seem to know
what they're doing a lot of the time,
where they just release a dozen galaxies
and then one of them will have sentient life in it.
You know what I mean? Yep.
And it seems like we're giving it to them this time.
We're handing them the solution.
Consumers are demanding this.
And with the way that charging technology has improved
over the last few generations,
where you can charge high capacity batteries
much more quickly,
are they just not doing it
because it adds so much to the BOM cost of the phone?
Like, is that the issue we're seeing here?
Probably some of that is, right?
Batteries aren't necessarily cheap.
And it's, you know,
depending on how many you're gonna make,
it's an upfront expense, right?
And if it's a huge flop,
you have to eat that cost.
I was one of the people that when I had a Palm Pre,
way back in the day,
like I bought- You're so young and hip.
I know.
I bought like the giant,
I forget who made that battery, the off-brand, Seidio.
Sure.
Never heard of it.
Off-brand battery. I'm too young.
It was like a big battery and it had a different back on it
that was way fatter than everything else.
And I thought, this is dumb looking,
but man, I can go two and a half days
without having to recharge the phone.
So these types of things still exist.
I'm trying to think what was the last phone I did that on.
The Nexus 5.
No, no, Nexus 4.
Google Nexus, one of those.
I bought like an extra large battery
that had to have a different back case to it
that made it look dumb
because it was very obviously not what it shipped with,
but it was a much more usable device because of it.
You know, at least, okay,
I'm gonna end this topic with an anecdote here.
At least it's not as bad as the really old days.
Did you ever have a pocket PC?
I did, yes.
Okay, did you ever have one of the pocket PCs
that actually didn't have persistent storage?
I don't think I did have one of those.
I had an HP iPAQ where actually the onboard storage,
not the RAM, I'm talking storage,
the onboard storage did not retain data
if the device fully discharged.
And so it actually had two battery meters on it.
There was like the usable battery meter
and then there was the reserve auxiliary battery meter.
So once you ran out of battery,
the device would power off just like it was off,
but then it had a reserve there
that would last for about five days, I think,
or something like that.
And if you lost all of that,
you had to plug it in and charge it
and it was factory reset.
Wow, I did have an iPAQ.
I don't know if I ever had one that did that,
but I don't remember that ever happening.
So I don't know, that's weird.
Yeah, it was super, super stupid.
Speaking of super, super stupid,
I'm gonna do our sponsor segments here.
So what's not stupid is lynda.com.
You can actually make yourself smarter.
Actually, okay, I shouldn't say that.
Making yourself smarter is a little bit challenging.
You can actually increase your IQ
by exercising your brain.
And you can exercise your brain
by using lynda.com to learn new things,
but lynda.com is more about acquiring new skills,
whether that new skill is digital photography,
or whether that new skill is something like video editing
or programming.
You can learn all kinds of great stuff on lynda.com.
Their courses are being refreshed all the time.
They're taught by experts.
They have guided paths that you can follow through
to learn new things and become truly proficient at them
to the point where you might even get
a new career out of it.
It's super affordable.
And if you're not sure if it's right for you,
all you gotta do is go to lynda.com slash wan show
for a free seven day trial.
It's all you can eat.
You can try out as much different stuff as you want.
And if you don't find anything that excites you,
then hey, you cancel it.
If you do find stuff that excites you, then great.
Sign up for a membership and start learning.
We've actually got now three employees
at Linus Media Group who use the skills they learned
with lynda.com daily at their jobs.
So it really does work.
It's really awesome.
It's one of the, actually it's one of the sponsors
that I get the most testimonials about.
People tweeting at me, hey, I'm on lynda.com.
Thanks for the recommendation.
It's awesome.
So there you go.
Now this one, Phantom Glass.
I don't get nearly as many people telling me
that they have it and it's amazing.
So my guess is that you guys haven't tried it yet
because it really is awesome.
I've got Phantom Glass on my One M8.
You would never know it from touching it.
I was actually showing this to someone.
Where was I?
It was at one of the factory tours.
Yeah, it was at Sennheiser.
We were talking about using high quality materials
in hardware and for some reason, yes,
we were talking about some device
that had a touch sensitive glass thing, glass panel on it
that was in the room where we were getting our presentation.
And I forget how we got on the topic,
but they were talking about how screen protectors
or really anything but Gorilla Glass
or similar high grade glass coatings
really feels awful when you're using it.
And so I pulled up my phone and I went,
well, I bet you don't know
that I actually have a screen protector on here
that's made of Gorilla Glass.
So the way that it works, and we did a review video of it
that ended up being the springboard
for the sponsorship relationship
because I loved the product so much
that we decided to reach out to them and go,
well, hey, should we work together?
Because I'm totally willing to endorse this stuff.
It's freaking awesome.
It's got a nano coating on the backside
that's actually removable and can be reapplied
if for whatever reason you wanted to do that,
but it never comes up on its own.
It's actually quite hard to take off.
The only reason I took it off once was
because when I was putting it on in the first place,
I put it on in the wrong spot and I was like, oh crap,
there's no way I'm gonna be able to take it off,
put it back on, it's actually gonna work.
It did, and then the front of it
has the same oleophobic scratch resistant properties
as Gorilla Glass 3 because that's exactly what it's made of.
The only thing you give up is that there is gonna be
a slight ridge around the edges of the screen,
but it's thin enough at least
that you can use it with most cases.
In fact, I think they advertise all cases,
but I hate to say that
because I'm sure someone somewhere will find a case
that it doesn't work with,
but it'll work with most aftermarket cases
without any gaps or any sort of weirdness.
So guys, check it out, store.phantom.glass.
They're gonna be sponsoring us at CES this year
and we're gonna be doing either,
I'm not quite sure which it'll be yet,
but we're either going to do a dedicated video
where Luke and I run around just going into booths
and rubbing our phones on crap
and getting people's reactions,
or we're just gonna do that randomly
during interviews with people
and just kind of cut that together into a funny montage
because it's that protective.
Not only is it very scratch resistant,
but you can actually replace it
and your screen underneath will be perfect, of course,
still just like any other screen protector,
except that it doesn't look like ass and feel like ass
because you know, if it looks like ass and feels like ass,
it's probably ass.
Just like that whole, that duck saying.
Looks like a duck and quacks like a duck.
Anyway, the point is,
I think we should move on to our next topic here.
Thanks to our sponsors, lynda.com and Phantom Glass.
And you're back, right?
Will you promise to wear a costume when you do that video?
I probably should.
What would be, like a ghost costume?
Like Phantom Glass.
Ooh, ooh.
Sure, maybe Pac-Man ghost, you know, there you go.
Maybe something more comical.
I don't think people mostly cosplay at CES,
but we could start that trend.
You could try, you could try.
Yeah, would you do it if we did it?
Would you cosplay CES all your meetings
and all your show floor visits?
Probably wouldn't, probably would not.
All my meetings, I don't know.
Maybe once or twice, but maybe not all of them.
I'll go to all the parties with you dressed up like that.
How's that?
Speaking of Glass, ethnod posted this on the forum.
The original article is from Digital Trends.
I'm just gonna go ahead and screen share here.
Google Glass can now add closed captions to real life.
Absolutely fascinating technology.
I mean, I think we all saw stuff like this coming.
It's just augmented reality.
That's all it really is.
But the fact that we're getting there
with this frankly very rudimentary hardware
that we have now is extremely exciting.
So the idea is that you would be able to
maybe not necessarily transcribe an entire conversation
perfectly for someone who is, let's say,
completely, completely deaf,
has no hearing in either ear.
But if you've got someone who's hard of hearing
and misses a word here and there,
or even in some cases people who speak different languages,
you would be able to communicate with each other.
That's really the killer feature for this, right,
is going out of the country.
I go to China.
I don't speak Mandarin.
I can have a conversation with somebody
if we are both wearing these devices.
It's gonna require more processing than we have today.
It needs to be perfected to a certain degree.
And roaming needs to go away
because you're not gonna be using a feature like this
without your internet connection.
That's true, that's true.
The world's got a long way to go yet, but.
It does.
I had Glass initially and used it for a little while,
and it had the ability to translate signs, right?
There was an app that could do that, right,
where the idea was road signs or street signs
that gave mileage or directions or street names
or something like that,
and a different language would be translated for you.
And it worked okay, but it was something like
where you needed to be very still
and you had to point the camera in a very specific way.
And this seems to be a more useful implementation
of that idea, right?
Because I mean, Google,
if anybody has voice-to-text transcription down,
it should be Google, right?
All the Google voicemails they do,
all the closed captioning they do on YouTube videos
and all the corrections that they do for that stuff,
they should be pretty good at that.
You would think that.
I actually had a pretty stupid experience
with Google Now earlier today.
I think I asked for directions somewhere
and it Googled directions to wherever.
Like it still does that for me about 40% of the time.
I hate Google Now.
Really? It's infuriating.
It's way better than what Siri does.
We have this debate in our office all the time.
No, no, no, no.
Yeah, it is, yeah, absolutely.
I'm sure the audience doesn't want me
to do the whole Siri versus Google Now thing again,
but why don't we just say this?
I disagree with you and we'll have this debate
next time you're on the show, okay?
Okay, all right, fair enough.
All right.
I'll bring my best Google Now phone
and we'll have a speech off.
We'll have a speech off?
I would actually be interested in that.
We'll see if we can stump each other's phones.
Yeah, that'd be good.
That might be kind of fun.
All right, speaking of things that are fun,
customer information compromised in an AT&T insider breach.
This was originally posted by Dietrich W
on the forum and the original article here is from CNET.
I'm just gonna pop this up on the screen.
AT&T warns 1,600 customers of data breach.
So basically, this wasn't a systemic problem necessarily.
It wasn't that AT&T was intentionally
collecting this information and sharing it
with crappy folks or whatever the case may be.
This was a case where a rogue employee
went and stole this information.
And I think this underlines one of the real issues
with data collection.
The issue is not necessarily the terms of service
because this is clearly against AT&T's privacy policy.
The issue is that if you're collecting the information,
someone somewhere may have access to it
and they're the problem.
That's not a policy issue.
That's just a people are people
and bad people are bad people.
So I don't really know what to say.
The employee's been fired
and everyone affected has been contacted.
But I mean, what do you think, Ryan?
Should we go tinfoil hat here?
Are we afraid of people having our information
or should we trust companies?
I mean, this is really the oldest form of this
that you'll ever find, right?
Go back, I mean, bankers have had access to your money
for 100 years, right?
And this is not going to go away.
If anything, the computerization of these processes
will take people out of the process,
out of the pipeline of these things occurring
and so you have less instances of this.
That doesn't make you more safe, of course,
as we've seen data breaches in other ways.
This is more of a, hey, you had a really crappy employee
but that could happen at a hospital,
that could happen at a bank,
that could happen at a restaurant
where the waitress has a skimmer in her pocket, right?
Right, yep.
Those are all personnel issues
and there's always gonna be people in this world
that are trying to do those types of things, right?
So would you prefer a digital system
versus a human intervention system in this case then?
Man, I don't know.
I gotta say, the scope of the theft here
is so small compared to what happens
when something goes wrong with a digital system.
True, they only got 1,600, right?
1,600 people had their information compromised
and it sucks to be you, it sucks for those people
but we're not talking 100,000 or 160,000
people with information compromised.
What will happen, who was it?
Crap, was it Target that had a major problem
with their POS system?
Thousands and thousands of customers' information
was compromised, what was it, credit card numbers?
Yeah, I think so, yeah.
Can't remember the exact details but.
I know recently it was Home Depot and Jimmy John's,
like my bank called me a couple weeks ago
and said, hey, we're sending you a new
checking account card and I was like, why?
He said, well, we looked and you'd shopped
at Jimmy John's and Home Depot in the last 60 days
so you're getting a new card.
I was like, that sucks but hey,
thanks for being proactive about it, I guess.
It's interesting, there was a service
that I saw a Kickstarter for recently,
I can't remember the name of it,
where it automatically generated a unique
credit card number for each merchant that you shopped at,
for example, and then if that merchant
ever had a security issue, that number
would be automatically canceled, right?
And nobody else had access to it
so you didn't have to go through
and change your card information
and change who pays your bills
or change all those other companies
just because one merchant screwed up.
It was an interesting idea, I don't know exactly
how it worked necessarily but that is.
That's really interesting.
Yeah, I'm trying to remember what it was called.
I'm sure somebody in the chat will know
eventually Level, I wanna say Level.
Level credit card, I don't remember what it was.
But it was a Kickstarter that basically
had this service, right, so anywhere you shopped online
or even if you called somebody on the phone,
you could generate a unique number
for that merchant through their app
and give it to them that way.
Obviously, if you have to swipe your card,
I don't think that would work.
Right.
It seems like a good idea. That's incredibly cool.
I mean, I think we're gonna have an issue
with there only being so many credit card numbers available
and we might have kind of an IPv4 versus IPv6 type of issue.
We're gonna have to start adding like alpha characters
to credit card numbers.
I mean, this is gonna happen eventually anyway
but I mean, that's a really interesting approach.
It would have to be done by more than just one app maker.
Like it would have to be the banking system
actually implementing this at a much higher level
where each individual has, you know,
a thousand or tens or even hundreds of thousands
of potential credit card numbers applied to them
and that would be a much faster way
to narrow down data theft
and particularly financial data theft.
I think it would be much less likely to happen
in the US than Canada though
just because our central banking system
allows us to do pretty much everything faster than you guys.
You know, whether it's chip-based cards
or even pins versus signatures and all that kind of stuff.
Yeah, we don't have chip and pin here yet.
They always say it's coming soon but.
Yeah, sure, whatever.
I encountered one and this is actually
kind of a funny story.
If you don't mind if I just kind of
Sure.
mumble for a bit here.
I was down at PAX and I was on my way back
and I realized I didn't have enough gas to get home
so I wanted to get something while I was in the city.
I went to the gas station and I went to pay at the pump
and it prompted me for my zip code, of course,
I don't have a zip code, I'm not American.
So I was like, oh crap, right, this song and dance again.
So I have to go back, I have to go into the station
and I go to pay the guy, he asks, how much do you want?
I say, oh, I don't know, man, fill up.
And so he hands me the card thing and I go to swipe it
and he says, oh, no, no, don't, right,
we're having a conversation before this
and he asks where I'm from, I tell him I'm from Canada,
right, so he hands me the thing, I go to swipe it
and he says, I know you're not Canadian.
And I kind of went, what?
He says, no, I know you're not Canadian.
I'm like, no, I'm Canadian.
And like here we've got someone I'm trying to use
a credit card with telling me they don't believe
my identity so I feel like I have a bit of a problem
here right now and so he says, do you know how I know?
Sure, humor me.
He goes, because you didn't try to put the chip reader in.
And I go, well, okay, Sherlock, I'm in America
where they generally don't have chip readers
and so I went to swipe it and it's really,
you're not that much of a detective.
But he looked at my ID really closely.
Wow.
It's just like, okay, whatever, man.
We'll get those one day.
Sure, why not?
You guys will catch up to Canada someday.
Good luck with that.
Story of our life.
Speaking of catching up with Canada,
this segue makes no sense.
Bridgestone releases air-free tires
you never have to inflate.
This was posted by 13CA350 on the forum
and the original article here is from CNET.
This is actually not the first time
we've seen any kind of airless tire attempted
but this is the first time that Bridgestone
is really saying that, hey,
these might actually be not that bad.
This is their second gen tire
and if you guys have a look at the images here on CNET,
it actually looks pretty cool.
Every part of it is recyclable
and shock absorption is handled by the shape
of the spokes here and yeah.
I wonder, I'm sure they do talk about it
because they mentioned shock absorption
but I have run-flat tires on one of my cars
and they are noticeably less, let's say, good
in terms of ride quality, right?
Especially when there's no air in them
because I've had a flat tire and a run-flat before
and it's very stiff after that
so I imagine, I mean, the design looks pretty cool
so I'm sure they're taking that into consideration.
Yeah, they have a replaceable tread on them
which is another really cool thing.
I mean, this looks like as much a sustainability improvement
as it is a car maintenance
and potentially safety improvement.
I mean, I think it'll be a long way
before we see any kind of a commercial product
like something that you would put on a semi-truck like this
but for consumer vehicles,
other than the fact that it's so ugly,
I would never consider putting it anywhere near even my car.
It's pretty exciting technology.
I'm sure they could make it look better.
They could hide the weird stuff on the side
with some kind of design, return of hubcaps, let's do that.
Return of hubcaps, yeah, yeah, if we hid it enough,
that could probably work.
And then the return of hubcap theft.
Yeah, okay, that too, yep.
Or when they fly off on the highway
and smash into somebody else's windshield,
there's that too.
Yeah, or that.
Maybe we could have like a really nice plastic hubcap
that just looks metal,
like maybe we'll really get the hang of fake metal.
They'll finally figure that out, yeah.
Just spray paint it, it's fine.
So speaking of things that we're finally figuring out,
we still haven't finally figured out
that people want more battery life in their phones,
but HTC, and this was posted on the forum
by opmonkemathew78, HTC has finally figured out
that some people use their selfie cameras
more than they actually use their other cameras anyway,
so why don't we just produce a phone
that has a front-facing camera
that's just as good as the back one.
The HTC One M8 Eye has two 13-megapixel cameras,
one on the back and one on the front. It takes 1080p video,
it even has a two-tone flash on the front,
so you can really take pretty much, almost,
the same quality photos with the front camera
as you can with the back one.
Ryan, do you take a lot of selfies?
No, I do not, however, this makes a lot of sense.
Like, I think, what's funny is,
I think this makes way more sense
than the phones that had the tiny screen on the back of it.
You remember those, which one was it,
that had a small screen on the back
so that you could see what you were taking a picture of?
I remember that. No, maybe it wasn't a phone,
maybe it was a snapshot camera.
No, I had one, I had a, ah, shoot, what was it called?
What was it called?
I don't know.
But the fact that we even went through that process
was like, I know, we'll put a screen on the back,
when in reality, what we should have done
is put a better camera on the front.
I guess it depends on which one was more expensive.
This is the Samsung Jive.
I have one of these.
And it was a flip, so here's the camera here,
I know this is a really small image,
and it had a small screen on the front
for previewing messages and seeing the time,
and I believe I could also use that front screen
to line up photos.
Yeah.
If I recall correctly.
I think there was like a non-phone snapshot camera
that had that as well, maybe by Canon or Nikon or somebody
that had like a little screen on the front
right next to the lens so that you could see
what you were pointing it at when you did that.
But yeah, this makes way more sense.
You've got a really nice screen already on the phone,
and just take that camera and add another one, I guess.
I don't know, is this an exceptionally expensive phone
by any measure?
I don't know.
I don't think so.
I don't think it really affected the cost too much at all.
I'm setting up a straw poll here, though, guys.
I wanna hear from you.
How many selfies do you take per week
if you're willing to admit it?
Remember, these straw polls are anonymous.
Yeah.
Or as anonymous as anything is, I guess.
Somewhat anonymous.
So anyway, let's just see if there's anything here.
I have to, okay, so it's gone quietly on sale in China
for the equivalent of 652 US dollars,
and will apparently only be a China and India release,
at least for the foreseeable future.
Why do they get cooler stuff than us?
Dual SIM, better selfie cameras?
Okay, at least dual SIM is cooler,
but why do they get better stuff than us?
Our cell phone carriers are a pain in the ass.
Oh yeah, I guess there's that, right?
So it's kind of Wild West out there, is my understanding.
Yeah.
All right, so it features autofocus.
Oh, there's another camera, too.
There's the Desire Eye.
So that one is also a selfie optimized phone.
So they're both coming out.
Now we gotta have a look at the straw poll.
This was the one straw poll
that I knew I had to do in this video.
Wow.
I bet I can predict the results.
So the number of you that,
maybe this is why we're not getting these cameras,
because 83% of you say 'none' in response to it.
I think our audience on this show
is very different than the audience
for any of these phones.
However, I know there are a lot of 12-year-old girls
out there that do this all the time,
because I have a niece that does it.
Every time she sees me, she's like,
oh, let's get a picture, and she holds it up,
and she puts it on Instagram, and I go,
I don't know what's going on anymore.
Kill me now.
I'm 32 years old.
What are you doing to me?
That type of thing.
I don't feel like I should feel this old yet.
Yeah, yes, exactly that.
But I bet there's a pretty big market for it.
I don't know.
You can sell anything.
Why not?
And yet we get the selfie phone,
but not the high battery capacity phone.
Are you freaking kidding me?
So sad.
Yeah.
Speaking of things that are sad,
Steam releases Canadian pricing.
Some game prices increased, others remain the same,
and some games currently unavailable for purchase in Canada.
No.
I don't know what the issue is
with Steam pricing just being in US dollars.
I mean, maybe, you know what?
Great opportunity to do a Twitter blitz.
Guys, hit me at linustech, and let me know
if you can think of some reason
why it's not perfectly okay
if ultimately they're just gonna be converting
their pricing and then displaying it
in your own native currency anyway.
Why it matters at all that they don't just have US pricing,
because at least if it's a straight conversion,
it's ultimately up to my credit card company
how much I'm paying in terms of fees,
which is usually not more than a couple percent
over whatever the nominal exchange rate is
that you would get at the bank.
Why do I need them to display it in my currency?
I mean, they were already displaying a conversion.
Well, okay, no, no, I don't think they did, actually.
But whatever, xe.com is not exactly
a difficult site to remember.
So before this, you just purchased it
in US dollars, right?
Yeah, and then my credit card statement would say,
you know, $57.21 Canadian, $54.99 US in brackets.
Now, you said that there were some games
that were no longer available after this?
Yeah.
I don't understand why that would be the case.
They're probably gonna get it fixed,
but my guess would be that they rolled it out
and it would have to do with something like
agreeing with the game developer
on what the price is gonna be in the new currency,
for example, because a lot of stuff is not,
actually, you're American, you might not know this.
A lot of stuff is not done just on a straight conversion.
For example, books are notorious for this
because they'll have a cover price, right?
And you probably don't even have the Canadian cover price,
but we usually- It shows both on our books.
Okay, you guys get both as well.
So you'll see $4.99 US, $7.99 Canadian sometimes.
Significant price differences, I suspect.
I just thought that it was the tax
for living further north, I don't know.
No, no, it's just, it's based on,
some companies actually lock their exchange rate
as rarely as over multiple years.
I remember back in 2000, shoot, I think it was 2008, 2009,
when the currency was all over the place.
And I forget who it was, but there was one Canadian company
that only redid their exchange rates,
I think once every two years or five years
or something like that.
And they basically sat there losing money
for, I think, six to 10 months before they finally wised up
and went, oh, this isn't gonna change,
so we should probably do something about our pricing.
So yeah, NBA 2K15 went up 16.6%, from $59.99 to $69.99,
which isn't that reasonable.
Overall, the transition looks more smooth
than when they added euros in 2008,
where many games retained the same numerical price,
but just in euros, which sucks.
I'm curious, on Steam, when you download
in Canadian packets over Steam,
does it download just as fast as you would expect?
Does it saturate your bandwidth there, does it?
Okay.
Yep, I mean, we're so close.
I mean, I can spit to Valve.
I'm closer to Valve than you.
Like, in terms of server locations,
there's no penalty for the border,
and the north to south connections are excellent.
Okay, all right.
Yeah, I can play on Washington-based servers
in a game with very, very low ping.
We always joke about Jeremy on our podcast,
lives in Vancouver, and we always joke
that whenever he has Skype issues,
it's because of Border Patrol blocking packets across.
Blocking packets.
The scary thing is that,
normally, I would kind of laugh at you,
and I'd go, ha ha, Ryan, that's a funny joke,
except with all the stuff that's been going on
with the NSA in the last year,
I wouldn't even be that surprised to find out
that someone is actually manually sorting packets
at the border.
Well, no, this one's fine, this one's fine,
this one's fine, nope, not that Linus guy.
This one's fine.
Yeah, not this guy who's tea-bagging the other player.
Let's just go ahead and turn these packets off right now.
Disconnect.
Yeah, jackass Canadian.
All right, so I've got one more topic
that I'd like to discuss a little bit.
This was actually posted by ethnot on the forum.
The original source was a little site
you may or may not have heard of before
called pcper.com or something like that.
I don't know, some American guy runs it
or something like that.
Arm and TSMC apparently headed for 10 nanometer.
Dun, dun, dun.
What does this mean?
What's gonna happen?
Man, you know, it doesn't mean jack.
The problem is, so-
I'm telling Richard how to be on you.
Yeah, well, TSMC is a manufacturing facility
that makes chips for just about everybody.
AMD on the graphics side
and on the processor side somewhat.
And also Nvidia and also Apple and others
and everybody, right?
Their problem, their main competitor is actually Intel.
Intel fabs stuff, primarily for themselves.
They have a couple of odd jobs every once in a while.
So the issue is, we're at 22 nanometer today
on Intel's front.
And we're still kind of stuck on 28,
kind of 20 nanometer on TSMC.
Hold on, Intel's rolled out 14?
Sort of.
Right, Broadwell is 14, right?
And it's just, I guess it did kind of
just actually start shipping in notebooks this month.
So it did, you're right.
But the 20 nanometer from TSMC,
it's an odd conversation
because it's very limited production
and it didn't have the benefits
that a lot of people had expected to see
for high-performance parts.
That's why the Apple A8 is being manufactured on it,
but no Nvidia or AMD GPUs are being manufactured on it.
Even though you would think
those would be the perfect, ripe example
of what should benefit from a die size decrease
in process technology.
Better density, lower power consumption,
faster switching speeds.
That sounds right up the GPU's alley.
Right, so what needs to happen now is TSMC
and those other groups are going into FinFET production,
which is Tri-Gate 3D transistors,
which is what Intel introduced in 22 nanometer.
And they're still prepping
their 16 nanometer FinFET parts,
TSMC and GlobalFoundries and those guys are.
And that will be kind of the first iteration of it.
And so I think what happened was,
this was in a Digitimes story,
kind of came out around the ARM TechCon convention
that was happening out in Santa Clara,
that they were saying, hey, look,
we're on track for our 10 nanometer taping out
possibly in the fourth quarter of 2015.
And so what a tape out means
is that you have manufactured a chip
to a specification that you approve of
and so they can begin the full manufacturing process.
And sometimes you'll tape out two or three revisions
before you actually start the manufacturing process.
So, they're talking about end of 2015 for 10 nanometer,
which means realistically mid to late 2016
before you actually see parts using that.
But I think we should be more concerned
about how quickly they get to 16 nanometer
and what that actually does.
Maxwell, NVIDIA's new chip, is built on 28 nanometer.
We all thought, well, six months ago
that it would be built on 20 nanometer.
That wasn't the case.
Whether it be capacity issues,
whether it be technology issues,
for some reason it wasn't built on 20.
I think at this point we all kind of assume
that they're just going to wait,
they're going to leapfrog 20 and go down to 16
whenever that becomes available.
AMD, the rumor is now that their next chip
will actually be using 20 nanometer.
And so we'll be able to see a comparison
of what a chip can do one way or the other.
And we may actually start to see AMD and NVIDIA
on opposing nodes of production, right?
Yeah, that'll be really interesting.
I mean, that hasn't happened
for an extended period of time in a long time.
So with AMD at a smaller manufacturing process
than NVIDIA for potentially months.
Right, yeah, it would be very interesting.
Josh, who is on our show at pcper.com,
he is a manufacturing guy.
He knows all about this stuff way more than I do.
And his kind of theory is that the 20 nanometer
probably doesn't offer a very big power consumption
or clock frequency advantage over 28 nanometers.
It's probably very small.
And so the complication of moving your product
from 28 down to 20 may not be worthwhile in the long run.
So that would be NVIDIA's thinking.
He's saying that the cost-benefit didn't look
like it made a ton of sense for them as well.
Yeah, because when they're talking about a cost benefit,
they're not just really talking about the dollar benefit,
but the dollar they have to spend per wafer,
what your yield is on that wafer
determines what each GPU's value is,
and then how much time you have to spend,
which is engineering money,
to convert it down to 20 nanometer and fix it.
Whereas with 28, they were very comfortable with it.
They already knew it.
And they were able to build, I think,
a pretty compelling part based on it, so.
Yeah, I'm surprised at how strong GTX 980 is.
I wasn't necessarily expecting them to be able
to do that much with it without a die shrink.
So as much as it's not what I really wanted,
which was GM 200.
Yeah.
Girl can dream, right?
It's not a bad part.
Just have to wait for 210.
Just have to wait for 210.
Just wait for GM 210.
It'll be fine.
Fair enough.
I'm sure it'll be here soon.
Yeah, I'm sure.
I don't know.
All right, well, I think that's pretty much it.
What time is it, your time right now?
It is 9:18, my time.
9:18.
Well, I'm sure your wife is gonna be super thrilled with me
and I'll get more glares the next time
I run into her at an event.
No, she was super nice.
She likes you now.
Now that she knows you, she's totally fine with it.
I said, hey, I'm gonna go be online.
Finish the show on Friday.
Is that cool?
She said, oh yeah, that's fine.
You did make your show earlier.
It used to be much later.
Didn't you used to do it at 7 p.m. Pacific
instead of 7 p.m. Eastern?
Yeah.
I did.
That was way more complicated
because then I was getting sleepy by the end of it.
I'm an old man.
I get tired.
All right, well guys, thanks so much for watching.
Thanks to our sponsors, lynda.com and Phantom Glass.
And thanks to our special guest,
Mr. Ryan Shrout from PC Per.
For those of you who are just tuning in
later on in the show here, Ryan,
do you wanna give them one more sort of
where to find you?
I absolutely do.
I absolutely do.
So pcper.com is our website with all of our reviews.
I'm a self promoter.
I'm gonna plug my YouTube channel as well
because we're all there.
Look, you have so many more subscribers than me.
It's not even a competition.
If you wanna watch videos about PC hardware,
our channel is youtube.com slash pcper.
I mean, we don't do as cool of videos as Linus does.
He doesn't walk around on his roof
with a hundred foot USB cable.
I did.
No, I didn't do that.
I did run a 500 foot ethernet cable once.
Okay, all right.
That worked successfully.
That's almost as cool.
And it got run over by a lawnmower.
So there was that.
And then it didn't work.
That's unfortunate.
Go figure.
So just those places.
I mean, if you wanna find me on Twitter,
it's just at Ryan Shrout as well.
So anytime you wanna have me back on, this is fun.
It's cool.
I like talking with a different group.
I did see a couple of people in the Twitter chat
asking me to tell you all about the wonders of IRC.
Oh, oh, okay.
Yeah, because you guys use IRC for your chat, right?
Yeah, we do.
I think it's part of our partner agreement with Twitch
that we don't use alternate chats.
Fair enough.
Fair enough.
So there's one reason why we might do things
the way that we do things.
Something that I think viewers a lot of time don't realize
is that there's probably a hundred reasons
behind the scenes why we do something the way that we do it
versus what's immediately obvious and apparent.
But yeah, that's the reason we don't do it.
And you know what?
Twitch chat isn't that bad.
No, I enjoy watching it.
I enjoy participating in it.
So I'm in there.
My name's popping up every once in a while here
and there we go.
I see a lot of people tweeting about how they hate you
and they never wanna see you as a guest again.
I understand.
I know I get it all the time.
I see why you're trying to invite yourself back
to try and like counter what they're saying.
No, we'll definitely have you back again, Ryan.
Thank you very much.
And good night, everyone.
We'll see you again.
Same bat time, same bat channel next week.
I actually don't remember who our guest is for next week,
but I think we might be having Paul and Kyle,
formerly of Newegg TV back.
So stay tuned for that and good night, everybody.
Sorry, it takes a while to switch scenes because...
Oh, and I've got this thing over here.
Doo, doo, doo, doo, doo, doo, doo, doo, doo, doo, doo,
doo, doo, doo, doo, doo, doo, doo, doo, doo, doo, doo, doo.
Ah, why is this the one with...
I gotta just delete the one with no audio at some point.
That would make a lot more sense.
Because I keep accidentally putting it here
and then I never check it because I'm usually like
throwing myself into my chair here to get the show started.
Today, I arrived back at the office at 2:30
and I shot an unboxing as well as a full Linus Tech Tips video
and then some B-roll for another one
before starting the show.
So like, it's just...
You did good, you did fine.
We didn't need a Skype pre-test.
We got this stuff down.
Yeah, that's right.
Skype, what could go wrong?
Nothing goes wrong with Skype.
And turn...