Scalpers are struggling to sell the RTX 4080 above MSRP, but retailers won't let them return the cards

If that's the case, mission accomplished for Nvidia. They've moved all of their old stock.

Hopefully. Maybe we will see a decent price cut on the 4090/4080.

I searched for the 3070 on Newegg; only 4 models in stock. Looks like ASUS has a card with a Noctua cooler and fan design.




I like the Noctua fan I've had in my PC for a good 9 or so years now, but that color scheme is still ugly. Was wondering when someone would work with Noctua though.
 
The local Micro Center's been having open box 4080s pop up now and then; no prizes for guessing what's happening there.

Me, I'm not bothering with that crap and am aiming for the RX 7900 largely out of spite for all the price gouging. It might not be a clean sweep of the 4080 in rasterization, but if it provides comparable performance for $200-300 less, especially in VR, that'll be my long-overdue desktop GPU upgrade.
 
I hate to be the bearer of bad news, but:

772487_1669830363304.png
 
Is inventory for the 3080/3090 thinning out? Wondering if the November shopping spree put a dent in it. Christmas is still around the corner.
The euro and pound rebounded a bit versus the October rate, so if it's in those markets, maybe the price stayed pretty much the same in USD (monthly averages, USD per EUR and USD per GBP):
Euro
  • Sep 0.991832 – 30 days
  • Oct 0.983173 – 31 days
  • Nov 1.020446 – 30 days
  • Dec 1.050526 – 9 days
Pound
  • Sep 1.133866 – 30 days
  • Oct 1.128302 – 31 days
  • Nov 1.173461 – 30 days
  • Dec 1.221973 – 9 days
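To put those numbers in context, here's a quick sketch of what a fixed dollar price converts to locally each month. It assumes the quoted figures are monthly-average USD-per-EUR and USD-per-GBP rates, and uses the 4080's $1,199 US MSRP as the example price:

```python
# Convert a fixed USD price into EUR/GBP at the quoted monthly averages,
# read here as USD per EUR and USD per GBP (an assumption on my part).
usd_msrp = 1199.0  # RTX 4080 US MSRP

usd_per_eur = {"Sep": 0.991832, "Oct": 0.983173, "Nov": 1.020446, "Dec": 1.050526}
usd_per_gbp = {"Sep": 1.133866, "Oct": 1.128302, "Nov": 1.173461, "Dec": 1.221973}

for month in ("Sep", "Oct", "Nov", "Dec"):
    eur = usd_msrp / usd_per_eur[month]
    gbp = usd_msrp / usd_per_gbp[month]
    print(f"{month}: {eur:7.0f} EUR   {gbp:7.0f} GBP")
```

So a constant $1,199 drifts from roughly 1,209 EUR in September to about 1,141 EUR in December purely on exchange-rate movement, which is most of a "price cut" without the USD price moving at all.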
 
Nvidia features are key.

RTX Voice, NVENC, being able to play with the emerging GPT-based code: it's all stuff AMD needs to catch up on for their pricing to make sense to me.
Not to mention ray-tracing performance (who buys a $1000+ card only to have to ignore the eye candy?) and DLSS 2 (in over 250 games) plus DLSS 3 frame generation (picking up steam quickly).

NVENC is huge for gamers (I never really made many videos pre-ShadowPlay, but nowadays I do make clips of MMO stuff).

CUDA is a great thing for those of us who do more than just use our PC as a gaming rig, too. I know pure gamers like to pretend it doesn't matter, but it does for a lot of people.
 
Time to make a bold statement: Stop Biting The Hook.

I'm watching reviews of these cards as people pretend to shit their pants with awe watching footage of 5+ year old games (or even 2+ year old games).....trying to tell me that spending $1500 so I can see better (but still not anywhere near perfect) PUDDLE REFLECTIONS is somehow supposed to transform my gaming experience.

They said the same thing to people with the 2000 series cards, and they ALL got universally burned. Then the 3000 series of Unobtainium-laced chips, guess what.....they can do it...so long as they drop their resolutions down and let "AI" (cough) make it all look Faux-K again!

They are now showing the 4000 cards which do it faster, but at half the power draw......which is great, but $1000+.

And you're still playing the same GD games.
I've seen so many reviews of games where the shitty-version is supposed to look bad compared to the awesome version............and I guess I'm just getting old but it's like "Why are you trying to feed me a Bullshit Sandwich? BOTH VERSIONS LOOK GREAT! I GREW UP ON PONG AND WIZARD OF WOR GOD DAMNIT, WE HAD ONE BUTTON...*ONE*...AND WE *LIKED IT*..." etc, etc........

Buy a $500 console, sit 7 feet away.....enjoy the game and don't worry about what you are missing, because you're missing 'extra frosting' on an already highly frosted sheetcake o' fun.
 
kronk-point.gif

 
Depends if you have the hardware to support it. I enjoy the higher framerates these GPUs provide me, as I have a high-refresh, high-resolution display. Sure, you really only need something like a 1660 Super if you only have a 1080p 60 Hz display, but some of us enjoy using better display technology for the obvious visual improvements.

We have not yet reached the point where GPUs are fast enough to keep up with displays that are high enough in refresh rate and resolution (and really pixel density, which matters: the difference in quality between a 27-inch 1440p and 4K display is more than you'd think) for visual quality to become a solved problem the way audio on computers is. I would generally say a 4090 is nearing that point, though, if you remove the raytracing requirements and look at raster only.
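On the pixel-density point, the gap between 1440p and 4K at the same 27-inch panel size is easy to quantify:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Same 27" panel, two resolutions:
print(round(ppi(2560, 1440, 27)))  # 1440p -> ~109 PPI
print(round(ppi(3840, 2160, 27)))  # 4K    -> ~163 PPI
```

That's roughly a 50% jump in linear pixel density, and 2.25x the total pixels to drive, which is why the GPU requirement climbs so sharply.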
 
Not saying chasing this stuff is bad, or going high refresh rate when you are a KBM user doesn't make sense, all I'm saying is we've had 3 generations of "next gen" video card hardware that keeps getting more expensive and yet the software barely looks different when you turn on all the fancy visual effects you just paid $1600+ for. Seems like a rip, or like we're spending more for very little bang.
 
I'm on 1440p/165 Hz. I picked up a used EVGA XC3 RTX 3080 for $525 with a 2-year warranty.

I am going to play around with some games and check it out.

I'm seriously thinking about going 4K once again.
 

I think one "issue" is how big the disparity in game budgets has become and how long games take to make. Red Dead Redemption 2 is 4 years old, and I imagine many games that need a 3090 Ti to play at the highest settings don't look better in some ways than Red Dead did on a PS4, but that was a multi-nine-figure affair.

And obviously progress is logarithmic; a jump like Turok 2 in 1998 to Splinter Cell in 2002 will not happen again. But isn't this a step above Call of Duty circa 2016-2017, which had playable settings on a 1080 Ti?

It is getting close to the point where someone glancing too quickly at certain parts could mistake it for badly compressed real footage.

It is a lot of dollars versus what it looks like on a $400 PS5 Digital Edition, for sure. The Callisto Protocol also has a certain look to it that (and I could be wrong) makes it feel like it must be more recent than 4 years old.

That impression of paying a lot for little could fade a bit if we stop feeling like we need to keep increasing resolution and FPS (and when that happens, I imagine we will keep a GPU and consider it top of the line for a very long time, a console-generation type of lifetime). Someone OK with 1080p/75 Hz/medium is still quite OK with a 1080 Ti:
https://www.techspot.com/review/252...

Wanting native 4K instead of the excellent upscalers we have, and wanting the hardware to run it, is part of the experience, the journey, the fun of it. It doesn't necessarily give a gaming experience that is subjectively better than Final Fantasy 3 on a 20-inch basement CRT TV was in 1995.
 
I’ve been playing with more creative applications, that’s a normal step for post lockdown wfh.
People don't want to sit in meetings and have me narrate PowerPoint slides live.

I respect audio engineers much more now, because trying to dial in your own voice sucks.

Adobe has been adding support for RTX core usage; the polished thing is adding fake 3D FX so basic graphics don't look like clip art.
I need to do better than basic moving icons or annotations.
I barely understand how to get sprite sheets looking natural.

I've been using Stable Diffusion and ChatGPT a lot lately.
Eventually I'm going to want to incorporate ML toy features in mobile apps.
Leveraging mobile SoC GPU capabilities has been on my radar for a couple of years now, because people are making money with really simple implementations.

So yeah, there's a lot of daily life involved, including trying to edit 30-minute walkthroughs better.
My output level is no thumbnail, no intro, no outro, like how I've used Movie Maker for a decade.
That could use some easy-mode polish, because no one reads instructions anymore.
 
I hate to be the bearer of bad news, but:

View attachment 533196
That's something I am quite painfully aware of right now, and it's not like I can wait for Intel to suddenly pull off a miracle with Arc Battlemage and undercut the established two players heavily while somehow not screwing up performance on the niche games I'm looking at, not with the sorry state their drivers are currently in.

But the frustrating part for me is that I'm still on a GTX 980 on desktop, and an RTX 2070 Max-Q on laptop that performs only like a desktop 2060, and actually much worse in practice when the thermal throttling hits and adds stutters that my desktops don't suffer from because they have adequate cooling, even when overclocked.

I think I'm long overdue for a new GPU right now to not horribly bottleneck that i7-12700K build I just set up. It's not that I want to spend $1,000, or hell, even $500 on a GPU, but that I'm effectively stuck having to spend ludicrous amounts to hit some high performance targets that I'm about to elaborate on below.

I get what you're saying, but here's the problem I'm facing, which has absolutely nothing to do with ray-tracing reflections:

VR is ludicrously GPU-intensive and some developers haven't optimized their shit. (Don't even suggest DLSS, as that doesn't even work in VR games - well, 2.1 might if devs bother with it, but 3.0 fundamentally can't.)

Go on, look up DCS World and especially MS Flight Simulator 2020 benchmarks. The latter just barely manages 45 FPS average with an RTX 4090, when VR calls for a consistent minimum of 80 FPS if not more, depending on your HMD's refresh rate. Thought 4K 60 Hz was an unrealistic performance target? High-end VR HMDs exceed that handily, and most of these flight sims are using old, irritatingly unoptimized engines that use maybe two of your CPU cores tops while also being way behind on 3D APIs and the efficiency improvements more modern ones can offer.
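A quick sketch of why those targets are so brutal: the per-frame render budget shrinks with refresh rate, and the raw pixel throughput of a high-end HMD really does exceed 4K 60 Hz. The example below uses the Valve Index's 1440x1600-per-eye panels at its 144 Hz maximum; real VR compositors also apply supersampling on top, which widens the gap further.

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Max render time per frame to hold a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 80, 90, 120, 144):
    print(f"{hz:3d} Hz -> {frame_budget_ms(hz):5.2f} ms per frame")

# Pixels pushed per second, before any supersampling:
four_k_60 = 3840 * 2160 * 60
index_144 = 1440 * 1600 * 2 * 144  # two eyes at the Index's max refresh
print(index_144 / four_k_60)       # ~1.33x the pixel rate of 4K 60 Hz
```

So holding 45 FPS average leaves you miles outside even the 80 Hz budget of 12.5 ms per frame; that's why reprojection kicks in constantly.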

No Man's Sky apparently doesn't fare much better, which is especially bizarre because that game easily gets manhandled at 1080p with a mere GTX 980. Put it in VR, though, and you'll be wishing for at least an RTX 3090 in short order just to maintain some semblance of smoothness at the same in-game detail.

Yeah, I know, those are more old games that have been inefficiently retrofitted to use a new interface, and barring NMS, none of them are going to be ported to that PS5 I've got sitting in my bedroom, let alone with PSVR2 support.

But the sense of presence and scale I get in VR is such a damn game-changer, the thing that feels truly next-gen to me more than raytracing ever did, that I find it hard to enjoy cockpit sims on a flat screen now. That's why I'm trying to scrounge up as much as I can to get a GPU that can do it justice, so I'm not constantly enduring reprojection warping all over that cockpit and headache-inducing frame drops when it can't even manage that.

Until then, it's optimized stuff like GORN, COMPOUND, VTOL VR and Vox Machinae for me... maybe even Elite: Dangerous too, but that one does push the GTX 980 a little harder than I'd like.
 

Then you are in the right spot to hit the Q1 releases.
Spending should tank right when CES 2023 kicks off.
A lot of the 20- and 30-something gamers in my area are getting a free $350 from the state, and that's sucking up used 3070s and 5800X3Ds.
 
I feel you on the VR. I was a Rift CV1 early adopter and currently own a Quest 2 from before the price hike hit them. VR games (and Doom, cuz you gotta KBM that shit) are the only ones left on my PC, but a 1080 Ti doesn't cut the mustard for lots of those games because, well, you know about the whole spacewarp/reprojection thing; it's there but not ideal. DCS World is literally the dream of all fast-mover pilots and pilots-at-heart. X-Wing vs. TIE Fighter made the 8-year-old in me squeeee, but with a 1080 Ti it's like playing a mobile game, so I get it.

But I don't game on any of these enough to justify a two-grand drop. I'm now glad I held onto my pesos during the 20-series and 30-series craze, because those boards didn't seem to give anyone much of an uplift. You could probably pound out decent frames on a 3090 or 3090 Ti in triple-monitor setups, but VR is just too niche for anyone to really invest in the optimization to get stuff running well.
 
From a gamer's point of view the 7900 XTX is better than the 4080, and the 7900 XT a little less so, but the performance difference between the XT and XTX is significant (around 25%). So that's some hundreds of dollars cheaper.
For professional work, however, Nvidia still has the edge, because CUDA is better supported, there is no OptiX equivalent on AMD, and it seems AMD GPUs after GCN are optimized for gaming rather than massive parallel processing.
To stay competitive with gamers, Nvidia needs to set prices like these:
4090 : whatever they want
4090 Ti 48GB GDDR6X : even much more
4080 : $1000
4070 Ti 12GB : $700 (previously the 4080 12GB at $900)
4070 12GB : $500
If they don't, they will lose market share among gamers, for sure.
 
We'll see the actual prices people can actually buy the XT and XTX at, because I feel that should push the 4080/4070 Ti pricing.

Nvidia always seems able to offer worse raw performance per dollar by quite a margin, so maybe the 4080 could still sell at that $1000 tag even when the stronger XTX can be bought at the same price.

But maybe in reality the XTX will effectively be an $1100 card, and the XT so rare that it often pushes toward $970, which would open the door to keeping prices really high.

It will be hard for AMD/Nvidia to let go of the established, extremely high 2017-2022 pricing (and for the consumer media environment, too). I'm not sure what a $500-600 card would look like today with, say, still a 40% profit margin for Nvidia/AMD and an interesting one for the partners, but I think it would be received extremely badly as a high-end card.
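As a sketch of the raw-performance-per-dollar argument: the MSRPs below are real, but the raster index numbers are made up purely to illustrate the math (XTX roughly matching the 4080 in raster, XT about 25% behind the XTX, as claimed above).

```python
# Hypothetical raster performance indices; only the MSRPs are real figures.
cards = {
    "RTX 4080":    {"msrp": 1199, "raster": 100},
    "RX 7900 XTX": {"msrp": 999,  "raster": 100},  # assumed raster parity
    "RX 7900 XT":  {"msrp": 899,  "raster": 80},   # ~25% behind the XTX
}

for name, c in cards.items():
    value = c["raster"] / c["msrp"] * 1000  # performance per $1000 spent
    print(f"{name}: {value:.1f} perf/$1000")
```

On these made-up numbers, even the weaker XT ends up better raw value than the 4080; that is the kind of gap Nvidia's pricing would have to answer, unless features (CUDA, DLSS, NVENC) carry the difference.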
 
We're much in the same boat, then. Was a CV1 early adopter myself, even got that GTX 980 just to ensure I'd meet the requirements on day one, but then VR games quickly outpaced the performance the 980 would've had to offer. Even the Oculus Home software post-2.0 update had serious issues with framedrops that I didn't have to put up with before.

And then I got a higher-resolution Valve Index full kit, which just made the gap worse and essentially forced me to run at 80 Hz rather than the maximum 144 Hz because I'd be in reprojection all the time otherwise.

I'd have been pretty envious of your 1080 Ti because if you disregard raytracing and DLSS, that GPU had some serious legs. Anyone who had the cash for one of those got some impressive mileage out of it, much like how I rocked the Q6600 and 4770K for so long on the CPU side of things. Kind of a shame that it's still not enough for the VR titles we're looking at, though; there's something clearly very, very wrong when a $1,600 RTX 4090 is considered entry-level for VR in certain games.

Honestly, I'm not surprised that the Quest 2 is now the dominant VR platform, much to the consternation of PCVR enthusiasts. The games on it, while mobile quality for obvious reasons, just work. People can optionally tether to a gaming PC through Oculus Link for PCVR and apparently get a surprisingly good experience out of it, but you're not required to have one, which saves a lot of money over getting a similarly-priced HP Reverb G2 on sale that needs a high-end PC, or worse, a significantly more expensive Valve Index full kit, or a Varjo Aero that makes the Index look like a budget HMD.

And now we have the PS5 + PSVR2 combo, for which the all-in cost is a little more than a Valve Index full kit by itself. It's not Quest 2 cheap, but it's still so much cheaper than getting a comparable PCVR experience that I think we'll be seeing a lot of developers pivot toward PSVR2 instead for games that can't run on Quest 2/3/Pro for technical reasons. Hell, even I'm starting to think that a PSVR2 may be a better buy than a new GPU at this rate, now that I've got the PS5 itself accounted for.

By the way, when did an old game like X-Wing vs. TIE Fighter get VR support? Weren't the cockpits still all 2D back then? Hell, sign me up for that! Not many older games get VR support grafted on after the fact, even if they have support for TrackIR or other head-trackers; Falcon BMS and IL-2 Sturmovik: 1946 certainly haven't had it happen as of yet. I'd be playing the hell out of those given my limited GPU performance and still having a blast, dated graphics be damned.
 

Way off topic, but the Quest 2 being subsidized hardware with near-Apple-perfect alignment of user experience, ecosystem, standalone/wireless capability, and marketing to all ages - no doubt it became dominant in a field where everyone else seems barely to be trying, comparatively. But it could only be consternation to short-sighted PCVR dorks who don't understand VR is not zero-sum. It's been an interesting evolution for sure: the goldfish attention span of the collective internet is ready to declare it dead at every moment, yet a slow burn of advancement continues quietly in spite of it, dissolving technical barriers and increasing fidelity, because VR, like raytracing, was always the holy grail, and there's only forward.

Back to the 4080/4090: a 4090 really is a big leap forward for bleeding-edge PCVR. A year ago, on an older Intel PC, I used to get 5-7 ms frametimes with a Reverb G2 + RTX 3090 in certain titles (VTOL VR was one), which seemed amazing. Fast forward to a 13700K + 4090 and it's 0.9-1.4 ms frametimes in some of those titles, and framerates on everything across the board more than doubled. It's absurd.
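Those frametimes translate into headroom like this - a quick sketch using a 90 Hz HMD's ~11.1 ms per-frame budget as the reference:

```python
def headroom(frametime_ms: float, refresh_hz: float = 90.0) -> float:
    """Fraction of the per-frame budget left unused at a given refresh rate."""
    budget_ms = 1000.0 / refresh_hz
    return 1.0 - frametime_ms / budget_ms

# GPU render times quoted above: 5-7 ms on the 3090 rig, 0.9-1.4 ms on the 4090.
for ms in (7.0, 5.0, 1.4, 0.9):
    print(f"{ms:4.1f} ms -> {headroom(ms):5.1%} of the 90 Hz budget free")
```

Going from ~37% spare budget to ~92% is what leaves room for supersampling, wider FOV, and mods without dropping into reprojection.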

Beat Saber with your friends on a Quest 2 is cool, but a 4090 + Pimax 8KX with 170° FOV + OpenXR Toolkit + DLSS all dialed in perfectly for a solid 90 FPS in MSFS, or flying low through canyons in an F-16 in DCS, or roaming around in the most comprehensive VR title in existence, Skyrim VR, or walking through Night City at street level in Cyberpunk 2077 with the Luke Ross mod, and it's a whole 'nother level. Everything else will catch up eventually, but it exists now for anyone enthusiastic or at least patient enough. I'm probably most excited about the Universal Unreal Engine VR Injector Mod in development, which already has a big list of confirmed titles and the potential to open up thousands of additional PC games, including hundreds of AAAs:
Alien Isolation
Bendy and the Ink Machine
Blade and Sorcery (The Outer Rim)
Crash Bandicoot
Cyberpunk 2077
Deep Rock Galactic
Devil May Cry 5
Doom 3 BFG
Dying Light
Elden Ring
Elder Scrolls Morrowind
Elder Scrolls Skyrim (Skyblivion)
Everyplace 2
Fallout 4
Final Fantasy 7
Firewatch
Garry's Mod
Ghostwire Tokyo
Grand Theft Auto V
Grounded
GTFO
Gunfire Reborn
Half Life (1&2)
Horizon Zero Dawn
Left 4 Dead 2
The Legend of Zelda: Ocarina of Time
Life is Strange: Before the Storm
Little Nightmares 2
Mafia (1&2)
Minecraft Dungeons
Monster Hunter Rise
Neon White
No One Lives Forever 2
Outer Wilds
Poppy Playtime
Pray for the Gods (SotC mod)
Raft
Red Dead Redemption 2
Resident Evil (2, 3, 7, 8)
Risk of Rain 2
Saints Row
Saints Row 3
Scorn
Session: Skate Sim
The Stanley Parable
Star Wars Dark Forces
Star Wars Jedi: Fallen Order
Star Wars Jedi Knight: Jedi Academy
Star Wars Jedi Knight 2: Jedi Outcast
Star Wars Squadrons
Star Wars TIE Fighter
Star Wars X-Wing Alliance
Stray
System Shock (remake)
Tony Hawk Remake (1&2)
Valheim
 
The absolutely massive barrier to entry for high-end VR is what will hold it back. For that alone I'm thankful for things like the Quest: it keeps feeding the need for VR titles while being approachable and affordable.

I still run a CV1, with a 3070 feeding it. I know upgrading would be nice, but a 4090... stops being sane when you add in $2K headsets and the like!
 
Meta's cornering of the market means VR games are being tailored to the lowest common denominator: the Quest headset operating standalone. And these games look like they were made for the Wii. Because of Meta's hold on the market, there is no push for "hardcore" games anymore. VR is going to be synonymous with the mobile gaming market thanks to Meta.
 
That was going to happen anyway. The high-end PC VR sets were never going to sell in the numbers required to fuel development. The economics just don't work. It's the same reason we don't see more than a handful of non-VR games with heavy requirements. Even before Meta, VR games were getting dumbed down to Wii-like titles.
The unfortunate part, imo, was the original Oculus push. People bought in as soon as the screens got somewhat reasonable, but the tech to mass-produce decently priced, high-res, high-refresh sets, and the GPUs to push them, just wasn't where it needed to be at that time. They sucked up all the investment and got a handful of developers to splash out some of their own. That is going to make it harder to have a real push in 5-10 years when the hardware catches up to the idea; imo, too many money people got burned, which means they won't be going back in, and others will remember their failure.

It sucks, but if the tech is going to take off now... we need GPUs on the low end to get to around where the 4090 is today (because 1% lows in VR make too many people puke). Once the hardware is there, it's going to take a Meta or an Apple, someone with deep pockets willing to stick their neck out and lose money for a while. My opinion, for what it's worth. :)
 
Then make a headset that doesn’t require a 5k investment to use. Not sure how else to fix that problem. You’re looking at a niche of a niche market.

Doesn't help that outside of simulators, most VR games have been mediocre at best. I enjoyed Robo Recall. Alyx got boring after about 4 hours. Lone Echo had the same problem as Alyx - never even bothered with the sequel. Not sure what else there even is - I play Beat Saber and occasionally replay Recall. Oh, and the X-Wing game.
 
That was going to happen anyway. The high-end PC VR sets were never going to sell in the numbers required to fuel development. The economics just don't work. It's the same reason we don't see more than a handful of non-VR games with heavy requirements. Even before Meta, VR games were getting dumbed down to Wii-like titles.
The unfortunate part, imo, was the original Oculus push. People bought in as soon as the screens got somewhat reasonable, but the tech to mass-produce decently priced high-res, high-refresh sets, and the GPUs to push them, just wasn't where it needed to be at that time. Those headsets sucked up all the investment... and got a handful of developers to splash out some of their own. It's going to make it harder to have a real push in 5-10 years, when the hardware catches up to the idea. Too many money people got burned, which means they won't be going back in... and others will remember their failure.

It sucks, but if the tech is going to take off now... we need GPUs on the low end to get to around where the 4090 is today (because bad 1% lows in VR make too many people puke). Once the hardware is there, it's going to take a Meta or an Apple, someone with deep pockets willing to stick their neck out and lose money for a while. My opinion for what it's worth. :)
Agreed. When not having enough horsepower is literally vomit-inducing, and you're back to tweaking settings constantly to make it playable, the average consumer checks out. That's the magic of the Quest. It just works. Anywhere. Any time.

And with Air Link you can use it as a display for a real game - just at a lower res than an Aero or the like.
 
Then make a headset that doesn’t require a 5k investment to use. Not sure how else to fix that problem. You’re looking at a niche of a niche market.

Doesn't help that outside of simulators, most VR games have been mediocre at best. I enjoyed Robo Recall. Alyx got boring after about 4 hours. Lone Echo had the same problem as Alyx - never even bothered with the sequel. Not sure what else there even is - I play Beat Saber and occasionally replay Recall. Oh, and the X-Wing game.
Squadrons makes me puke so very hard in VR. Elite Dangerous is dope though no problems there.
 
Anyone remember the GTX 280? Yeah, I paid $650 for it.
The GeForce GTX 980 Ti and Radeon Fury X both also launched at $650, I believe. I think the GeForce 6800 Ultra was $600?

I know there's inflation and other causes, but I miss when flagships were around $400, like the Radeon 9700 Pro, which was like $400-$450. Shiat, I used to complain about that price.
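For the inflation point, a back-of-the-envelope check. The CPI figures below are approximate US CPI-U annual averages I'm assuming for illustration, not numbers from this thread:

```python
# Back-of-the-envelope inflation check on the $400 flagship nostalgia.
# Assumed approximate US CPI-U annual averages: 2002 ~179.9, 2022 ~292.7.

def adjust_for_inflation(price, cpi_then, cpi_now):
    return price * cpi_now / cpi_then

radeon_9700_pro_msrp_2002 = 400.0
print(round(adjust_for_inflation(radeon_9700_pro_msrp_2002, 179.9, 292.7)))  # prints 651
```

So under those assumed CPI numbers, the $400 Radeon 9700 Pro was roughly a $650 card in 2022 money - right in GTX 280 / 980 Ti territory.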

Looks like ASUS has a Noctua cooler or fan design...Was wondering when someone would work with Noctua though.
https://www.thefpsreview.com/2021/0...ce-rtx-3070-graphics-card-with-noctua-cooler/

https://www.thefpsreview.com/2021/1...nounced-first-graphics-card-with-noctua-fans/

https://www.thefpsreview.com/2022/0...-3080-noctua-edition-graphics-card-announced/
 
Now that I've seen some tests on the 4080 covering CUDA, OptiX, and RT in gaming compared to the AMD 7900 XTX, I think I underestimated the card. It's only a little slower in raster gaming, but it's much faster than AMD with RT enabled. AMD doesn't use RDNA3's matrix cores for FSR the way Nvidia uses its Tensor cores for DLSS, parallel compute over CUDA or OpenCL beats the 7900 XTX with no problem, and OptiX leaves the AMD card completely in the dust. AMD could give developers something to exploit its RT cores, but it isn't here yet, while Nvidia is on its third RT architecture with the software ecosystem to match.
So the 7900 XTX has the raster performance for gaming without RT, plus 24GB of VRAM, which is of little use in gaming but does matter in professional parallel compute - where it's still weaker than Nvidia despite Nvidia shipping less VRAM.
So the value of the 7900 XTX at $1000 is not so great compared to $1200 for the 4080. The pricing is probably well balanced.
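To put that value argument in rough numbers, a toy perf-per-dollar sketch. The relative performance scores below are hypothetical placeholders chosen only to mirror the shape of the claim (XTX slightly ahead in raster, well behind in RT); they are not benchmark results:

```python
# Toy perf-per-dollar comparison; the "raster" and "rt" scores are invented
# placeholders on a 100-point scale, not measured benchmark data.
cards = {
    "RX 7900 XTX": {"price": 1000, "raster": 100, "rt": 70},
    "RTX 4080":    {"price": 1200, "raster": 97,  "rt": 100},
}

for name, c in cards.items():
    raster_value = c["raster"] / c["price"] * 1000  # score points per $1000
    rt_value = c["rt"] / c["price"] * 1000
    print(f"{name}: raster {raster_value:.0f}/k$, RT {rt_value:.0f}/k$")
```

Under these placeholder numbers the XTX wins on raster per dollar while the 4080 wins on RT per dollar - which is basically the "well balanced" read above.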
 