$2,500 RTX 5090 (60% faster than 4090)

Eye tracking would be another one of those techs (especially if the trend of monitors getting bigger for gamers continues, and it's obviously already there for VR). There's only a 1-2 degree cone of vision that needs to be high resolution (plus some margin of error around it). Especially if the system gets fast enough, say 1000 fps, to react to a quick eye movement, we could be at 300 DPI on a very small portion of the screen and everything else at some old-school 720p density and quality, I would imagine, since our peripheral vision only really sees movement and bright changes of light.
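The pixel math behind that idea is easy to sketch. The numbers below (110-degree FOV, a 4320-pixel-wide panel, a 2-degree fovea plus 1 degree of tracking-error margin, quarter-resolution periphery) are illustrative assumptions, not specs of any real headset:

```python
# Back-of-the-envelope pixel budget for foveated rendering.
# All figures are illustrative assumptions, not real headset specs.
fov_deg = 110.0            # assumed field of view
full_res = 4320            # pixels across the FOV at full ("retina") density
fovea_deg = 2.0 + 1.0      # fovea plus eye-tracking error margin

px_per_deg = full_res / fov_deg
fovea_px = fovea_deg * px_per_deg          # side of the high-res patch

full_cost = full_res ** 2                  # naive: whole panel at fovea density
fovea_cost = fovea_px ** 2                 # small patch at full density
periphery_cost = (full_res / 4) ** 2       # rest at quarter linear resolution

foveated_cost = fovea_cost + periphery_cost
print(f"naive pixels shaded:    {full_cost:,.0f}")
print(f"foveated pixels shaded: {foveated_cost:,.0f}")
print(f"savings:                {1 - foveated_cost / full_cost:.1%}")
```

Under those assumptions you end up shading only a few percent of the pixels a naive full-resolution render would, which is why foveated rendering is so attractive.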
Yeah, that too.

I believe the Varjo VR headsets already have a functional implementation of that, with some caveats of course (I haven't personally tested it).
 
This has been floating around on Reddit today.
(attached: the rumored spec table)


If those specs are true, the 5090 will be 60% faster: 50% more SMs/CUDA cores/RT cores/tensor cores, a 15% clock increase, 77% more L2 cache, and 50% more memory bandwidth.
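A quick sanity check on those rumored numbers (pure arithmetic on the leak, nothing official): +50% cores compounded with +15% clock is more than 60%, so a 60% real-world gain would imply less-than-perfect scaling.

```python
# Compound the rumored uplifts (all numbers come from the leak, nothing official).
sm_gain = 1.50      # +50% SMs / CUDA cores / RT cores / tensor cores
clock_gain = 1.15   # +15% clock increase

theoretical = sm_gain * clock_gain          # ideal compute scaling
implied_efficiency = 1.60 / theoretical     # if the real-world uplift is 60%

print(f"ideal compute uplift:       {theoretical - 1:.1%}")
print(f"implied scaling efficiency: {implied_efficiency:.1%}")
```

The ideal uplift works out to roughly 72%, so the rumored 60% figure would mean the extra cores and clocks aren't translating one-for-one into performance, which is typical as chips get wider.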
 
AI reconstruction and frame gen are the future, not pure raster improvements. A 4x stronger GPU will release when we also have higher refresh rates, higher resolutions, and more complex games, so suddenly you'll need 10x more power again, or whatever (random number, but you get the idea).

I'm not saying we have hit the limits of raster, but when you see how much more power-hungry GPUs have become, you could actually argue that we have reached some limits already.

And AI reconstruction and frame gen make perfect sense when you think about it for a minute: there's a horrifying number of GPU/CPU cycles wasted on things humans cannot see or perceive whatsoever (which is why those techniques work so well, even if they're not perfect yet). So it's essentially just more software optimization, and it makes the dream of 1000 fps/1000 Hz plus ultra-high resolution actually imaginable in our lifetime.
Yes, as you point out, the hardware will be catching up for a long time, as it's a moving target. Upscaling, frame gen, and eye tracking are each good for about a 30% effective performance improvement before they start to noticeably degrade image quality, if developers have the time and resources to implement them properly. That's very good, but it still leaves the hardware playing catch-up for a long time to come. It makes me reluctant to spend a lot on a GPU in this climate. The 3090 got clobbered by the 4090, which is looking to get wrecked by the 5090, and even a ~$2,500 5090 probably won't be able to max out my display hardware in some of the games I play. Even at the best of times, buying GPUs is like investing in a melting ice cube, but in this climate it's 100 degrees outside, so I'm not willing to pay too much for my ice.
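If you treat those three techniques as independent multipliers (a simplifying assumption; in practice the gains overlap), the roughly-30%-each estimate compounds like this:

```python
# Compound three ~30% effective-performance multipliers
# (upscaling, frame generation, eye-tracked foveated rendering).
# Treating them as independent is a simplifying assumption;
# real-world gains overlap and vary per game.
gains = {"upscaling": 1.30, "frame generation": 1.30, "eye tracking": 1.30}

total = 1.0
for name, factor in gains.items():
    total *= factor
    print(f"with {name:<16}: {total:.2f}x effective performance")
```

That works out to about 2.2x all told, i.e. roughly one GPU generation's worth of uplift from software alone under these assumptions.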
 
but in this climate it’s 100 degrees outside,
Not sure GPUs have ever had a longer lifetime, a better ability to keep their value, and a longer period between refreshes than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only games show up, and someone would still buy it around a decade after launch. Try doing that with a 1996 GPU in 2004...
 
Not sure GPUs have ever had a longer lifetime, a better ability to keep their value, and a longer period between refreshes than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only games show up, and someone would still buy it around a decade after launch. Try doing that with a 1996 GPU in 2004...
I agree. GPUs have more staying power than ever; you don't need to upgrade nearly as often.
 
Not sure GPUs have ever had a longer lifetime, a better ability to keep their value, and a longer period between refreshes than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only games show up, and someone would still buy it around a decade after launch. Try doing that with a 1996 GPU in 2004...
Funnily enough, I gamed on a GeForce 3 Ti 200 for years until a good deal on a 9800 Pro came along. This goes back to the split market I was talking about in an earlier post: some people don't see the point of cards being any faster because they can already max out their displays with room to spare, while the VR and 4K 120+ Hz crowd can't get a fast enough card to run max settings at any price, and even the 5090 won't change that.
 
Not sure GPUs have ever had a longer lifetime, a better ability to keep their value, and a longer period between refreshes than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only games show up, and someone would still buy it around a decade after launch. Try doing that with a 1996 GPU in 2004...

Although I agree that this is the case, it's not so simple.

In the GeForce 3 era the jump in tech/quality was MASSIVE. I remember the jump from Quake 3 to Doom 3. Doom 3 was unbelievably heavy, like 3-4 fps with an overclock.

It simply erected a wall that no GPU at the time could surpass. It took YEARS for it to be playable on a home PC.

Today? The evolution is much more constant, but it relies on existing tech and incremental updates (better shaders, bigger shadow maps, etc.).

It's like ray-traced versus raster shadows. RT today is the main drag on GPU evolution.

Raster shadows look almost as good as RT at nearly no cost, but they're not as trustworthy/correct as ray-traced ones, which are VERY costly.

Man, I'm old. I remember the jumps (maybe not in entirely correct order):
Doom -> Duke3D - HOLY S***
Duke3D -> Quake 1 - HOLY S***
Quake 1 -> Unreal - HOLY S***
Unreal -> Doom 3 - HOLY S***
Doom 3 -> Far Cry - HOLY S***
Far Cry -> BF1 - HOLY S***
BF1 -> BF2 - HOLY S***
BF2 -> Crysis - HOLY S***
Then something happened. Maybe it's me getting older, but maybe it's the "diminishing returns" becoming apparent!?
Crysis -> COD MW2 - Pretty good
MW2 -> BF3 - Pretty good
BF3 -> modern Unreal Engine - Pretty good
UE4/5 -> Cyberpunk - Pretty good

And then VR came and... broke me?! Monitor/2D/flat gaming is not as exciting as it was.

Playing FO4VR (or any open-world game that's "VRable" enough) gives a whole new sensory deluge that no image quality stuck on a screen can match.

In Half-Life: Alyx I spent hours just appraising the fine details in the weapons, the gloves, and the world itself. It's hard to describe, really.

Anyway, the 5090 can't come soon enough. I hope it brings an uplift of at least 80% in VR perf.
 
Doom 3 was unbelievably heavy, like 3-4 fps with an overclock.

It simply erected a wall that no GPU at the time could surpass. It took YEARS for it to be playable on a home PC.
That's not how I remember it at all. The 9700 Pro (August 2002) through the 5950 Ultra (October 2003) had launched before Doom 3 and could play it at launch (I did play Doom 3 at launch on my PC). Doom 3 was August 2004; maybe you had some leaked pre-release build? Carmack and id Software aren't really known for making heavy-to-run titles, and Doom 3 was running on the original 2001 Xbox with its custom GeForce 3.

More mid-level GPUs like the 6600 GT, a year later in 2005, could play it perfectly fine:

(Doom 3 benchmark chart)


A year probably felt much longer then than it does now.

But yes, that's mostly the reason: the exponential growth of GPUs and video games slowed down. All the easy fruit was grabbed, and now it is incredibly hard and costly to make anything better, with diminishing returns.

GPUs now have more use cases beyond playing the latest games than ever before: the library of old games is obviously much bigger, plus decoding/encoding platforms, crypto for a while, and now AI workloads of many kinds...
 
That's not how I remember it at all. The 9700 Pro (August 2002) through the 5950 Ultra (October 2003) had launched before Doom 3 and could play it at launch (I did play Doom 3 at launch on my PC). Doom 3 was August 2004; maybe you had some leaked pre-release build? Carmack and id Software aren't really known for making heavy-to-run titles, and Doom 3 was running on the original 2001 Xbox with its custom GeForce 3.

More mid-level GPUs like the 6600 GT, a year later in 2005, could play it perfectly fine:

(Doom 3 benchmark chart)

A year probably felt much longer then than it does now.

But yes, that's mostly the reason: the exponential growth of GPUs and video games slowed down. All the easy fruit was grabbed, and now it is incredibly hard and costly to make anything better, with diminishing returns.

GPUs now have more use cases beyond playing the latest games than ever before: the library of old games is obviously much bigger, plus decoding/encoding platforms, crypto for a while, and now AI workloads of many kinds...

You are correct about the cards, but for me those were impossible to get at the time.

IIRC I had a GeForce 2 GTS at the time!
 
In Half-Life: Alyx I spent hours just appraising the fine details in the weapons, the gloves, and the world itself. It's hard to describe, really.

Anyway, the 5090 can't come soon enough. I hope it brings an uplift of at least 80% in VR perf.
I'd drain my bank account to play through Half-Life: Alyx at max settings on a VR headset that's 4K per eye with MicroLED.
 
In the GeForce 3 era the jump in tech/quality was MASSIVE. I remember the jump from Quake 3 to Doom 3. Doom 3 was unbelievably heavy, like 3-4 fps with an overclock.

It simply erected a wall that no GPU at the time could surpass. It took YEARS for it to be playable on a home PC.

I had an ATI Radeon 9600 XT back then, and it really struggled with Doom 3 and Far Cry at 640x480 on a CRT monitor. Only Half-Life 2 ran decently.

It was only when I got a GeForce 7600 GT a few years later that I was finally able to play Doom 3 and Far Cry properly, and even at a higher 1024x768 resolution on an LCD.
 
I'd drain my bank account to play through Half-Life: Alyx at max settings on a VR headset that's 4K per eye with MicroLED.

I believe it's coming before 2030. The Pimax Crystal Super is looking VERY interesting at 3840x3840 (QLED, alas).

I had an ATI Radeon 9600 XT back then, and it really struggled with Doom 3 and Far Cry at 640x480 on a CRT monitor. Only Half-Life 2 ran decently.

It was only when I got a GeForce 7600 GT a few years later that I was finally able to play Doom 3 and Far Cry properly, and even at a higher 1024x768 resolution on an LCD.
I know, right!? Doom 3 was a brick to run.
 