Nvidia has got us.

I was recently looking for a decent FHD card, and to my surprise Nvidia is nowhere to be found in the ~$300 segment. My choice was between Intel and AMD, because what Nvidia sells at this price point offers laughably worse performance by comparison.
 
2K is half of 4K and double 1080p

Ok well there's 16k coming out so it's all vernacular I suppose
4K refers to horizontal resolution so by the same metric 1080p would be 2K ~2000 pixels horizontal resolution. And 1440p would then be 2.5K.
 
Better is the price/performance ratio. The hate right now, aka the 4060 Ti, is Nvidia shitting on that ratio compared to the last gen (or any gen).
I play VR and yes, Nvidia for the most part is my only answer, but sadly they can't seem, in 2023, to get a 16GB card under $1000! They are losing my $$. So I don't care if Nvidia is more mature and versatile as you say; the price/performance is not mature or versatile.
Well I mean the A770 16GB card is actually shaping up pretty well at this stage, with the latest drivers it handily beats out or at least ties the 7600, but I don't know if it works with Steam VR yet, it didn't 4 months ago but was working pretty well with the other platforms.
Could be worth a look for you though.
 
Yeah I'mma just keep buying Nvidia, still the better product with the best/most feature sets that lets me check the most tickboxes to 'on' in game settings and then runs with better efficiency

You do what's best for you and your situation

[attached image]
HUH? 50 watts is like $23 bucks a year but almost 40% more raw performance... LMAO.

 
HUH? 50 watts is like $23 bucks a year but almost 40% more raw performance... LMAO.

Better energy efficiency is indicative of a better engineered product

I use air cooling in my case, so more electricity means more heat, which means more fan activity, which means more noise and more dust pulled in. And then the additional electricity cost you mention is another negative on top of that.
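For what it's worth, here's a rough sketch of where a figure like $23 a year can come from; the hours per day and the electricity rate are my own assumptions for illustration, not numbers anyone here gave:

```python
# Back-of-the-envelope cost of an extra 50 W of GPU draw.
# hours_per_day and price_per_kwh are assumed values, not figures from this thread.
def annual_cost(extra_watts, hours_per_day, price_per_kwh):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(annual_cost(50, 8, 0.16))  # ~23.4 -> roughly $23/year at 8 h/day and $0.16/kWh
print(annual_cost(50, 4, 0.16))  # ~11.7 -> about half that at 4 h/day
```

Point being, the dollar figure swings a lot with how many hours a day the card is actually under load and what you pay per kWh; the heat and noise are there regardless.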
 
About as much sense as comparing AMD's top-of-the-line GPU with Nvidia's middle-of-the-pack GPU.

[attached image]
Not that unfair, the 4070TI is only $50-100 CAD cheaper than the 7900xt.
So is the extra money and better raster performance worth the trade?
The 7900xt is actually a pretty fair comparison to the 4070ti
 
Not that unfair, the 4070TI is only $50-100 CAD cheaper than the 7900xt.
So is the extra money and better raster performance worth the trade?
The 7900xt is actually a pretty fair comparison to the 4070ti
While that line of thinking is fair in and of itself, that comparison makes it appear as if the 7900 XTX and 4070 Ti MSRPs were always close. The 7900 XTX released at a $200 premium vs the 4070 Ti. It is only after discounts that the 7900 XTX is even within $50-$100 now. And once the 4070 Ti gets its discounts, the scales will shift again. This is why we should remember what class a card is when comparing it to another.

Let's not even mention that his proof is bunk to begin with. Both cards are pretty close in terms of performance. Pretty even at 1440p and only 3% faster at 4K.

[attached images]
 
I've seen better performance with AMD depending on the game. AMD 6900 XT vs 3080 Ti: upon launch of FC6, the 3080 Ti was a total stutter fest with textures loading at low res and then popping in, with the Ultra texture pack and RT on. The 3090 was ok but was still slower than the 6900 XT. Later it did not matter which card was used.

There are games each can do better than the other. Overall, per price category, I don't see much difference, and I laugh at the single-game points used. In Control, AMD just flat out sucks with RT. The game was done with DXR 1.0, and AMD will most likely suck with any game using DXR 1.0.

First wow moments in a very long time are with the 7900 XTX in Dead Island 2. Santa Monica Pier, OMG is the lighting done so well. Now the clown boss sucks, but that is beside the point. AMD shines in this game: 4K HDR, max settings, no FSR, 115 FPS average with Chill on to keep it in the FreeSync range. Gaming bliss. No dips, just smooth-ass gameplay.

I would have picked up a 4090, and maybe still will, but the DP 1.4 will become limiting in the future. I generally keep cards in use for 4+ years. By then I expect to have bought a couple of high-end monitors and possibly a VR headset needing DP 2.1. Nvidia limiting one of the most important aspects, display outputs, makes it less of a buy for me.

As for the 3080 Ti, since sold: with its limited RAM I saw its usefulness diminish. Not a 4-year card.

As for Nvidia's better feature set, to me it is more of an illusion than reality. How many RTX games will I really be able to play on my 3090 with RT on and a great, consistent frame rate that makes the gaming experience better than with RT off? DLSS is nice, but I've found it limited in real use. The interface sucks ass with Nvidia; you have to use other programs to OC decently. With AMD I have way more versatility: game profiling, Chill on/off, OC or UV settings, the list goes on and on.

So far VR is very disappointing with the 7900 XTX. Will need to tinker more to see if it really sucks overall.
 
4K refers to horizontal resolution so by the same metric 1080p would be 2K ~2000 pixels horizontal resolution. And 1440p would then be 2.5K.
Well I'm using Alex Jones' lexicon - I'll just invent words as I go!
 
:ROFLMAO:
Well I mean the A770 16GB card is actually shaping up pretty well at this stage, with the latest drivers it handily beats out or at least ties the 7600, but I don't know if it works with Steam VR yet, it didn't 4 months ago but was working pretty well with the other platforms.
Could be worth a look for you though.
The A770 does okay but then has some massive fall-offs in certain titles. Average benchmarks show the 7600 at +5%. Maybe at 4K the A770 might have the edge due to RAM, but who's doing that? Would take a 7600 all day long. But that's not my point. Neither would touch my RTX 2080 with its meager 8GB in VR.
I'm waiting in 2023 for Nvidia to offer a 16GB card for under $1000. Seems insane, no?
 
4K refers to horizontal resolution so by the same metric 1080p would be 2K ~2000 pixels horizontal resolution. And 1440p would then be 2.5K.
This is where naming gets fun; the following are all considered 2K by one standards body or another:
2K DCI - 2048x1080 (17:9)
2K HDTV - 2048x1152 (16:9)
2K Academy - 2048x1536 (4:3)
2K WQHD - 2560x1440 (16:9)

Really it comes down to aspect ratios, but 2K resolution is really anything with fewer than 2,000 vertical lines but more than 1,000, so technically and legally you can refer to 1920x1080 displays as 2K as well.

The term 4K was made up and changed the naming convention from vertical lines to horizontal lines, for marketing clarity in televisions, to make it sound more impressive than the High Definition, Full High Definition, and Quad High Definition TV displays that came before it. Really, following that naming convention, what we call 4K TVs should be Quad Full High Definition, which as you can see is quite the mouthful.
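Just to put numbers on how fuzzy the "2K" shorthand is, here's a quick sketch dividing the horizontal resolution of each of those formats (plus plain 1920x1080) by 1,000; this is my own illustration, not any official formula:

```python
# Horizontal pixels / 1000 for the formats listed above, plus plain FHD.
formats = {
    "2K DCI":     (2048, 1080),
    "2K HDTV":    (2048, 1152),
    "2K Academy": (2048, 1536),
    "2K WQHD":    (2560, 1440),
    "FHD":        (1920, 1080),
}
for name, (width, height) in formats.items():
    print(f"{name}: {width}x{height} -> {width / 1000:.2f}K")
# Everything from 1.92K to 2.56K ends up marketed as "2K" somewhere.
```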
 
For a mainstream user, the upgrade cycle is much larger and it averages around 3.0 - 3.5 years. So the vast majority of gamers upgrading to an RTX 4060 are going to be coming from either a GTX 1060 or RTX 2060 GPU and they are going to see a big upgrade in performance and feature set.

What a nice BS spin on the terrible 3060->4060 "upgrade".
I’ve got some friends using 1650s still that I would toss a $50 at if it convinced them to upgrade to anything better, because playing with them online is painful.
 
I’ve got some friends using 1650s still that I would toss a $50 at if it convinced them to upgrade to anything better, because playing with them online is painful.
Buy the 3060 12gb, 67x0 12gb or Arc 16gb

But for heaven's sake no one who keeps cards for 4 years or more should buy a gpu with less than 12gb vram in 2023
 
Buy the 3060 12gb, 67x0 12gb or Arc 16gb

But for heaven's sake no one who keeps cards for 4 years or more should buy a gpu with less than 12gb vram in 2023
They are still playing on 2GB in some cases and they are fine with it. Makes me cry inside, because playing anything online with them is a challenge. They spent more time loading D4 this weekend than playing, so hopefully that drives it home: sure, they get 30 fps in potato mode, but it's not worth it…
Told them to hold out for the 4060 (non TI) launch and see what the market does in reaction to that.
 
Told them to hold out for the 4060 (non TI) launch and see what the market does in reaction to that.
Hopefully the $300 4060 will force AMD to do 2 things:

1. Discount 7600 8gb to $250
2. Release 7600 16gb for $330 - $350.

One can always hope.
 
This is where naming gets fun; the following are all considered 2K by one standards body or another:
2K DCI - 2048x1080 (17:9)
2K HDTV - 2048x1152 (16:9)
2K Academy - 2048x1536 (4:3)
2K WQHD - 2560x1440 (16:9)

Really it comes down to aspect ratios, but 2K resolution is really anything with fewer than 2,000 vertical lines but more than 1,000, so technically and legally you can refer to 1920x1080 displays as 2K as well.

The term 4K was made up and changed the naming convention from vertical lines to horizontal lines, for marketing clarity in televisions, to make it sound more impressive than the High Definition, Full High Definition, and Quad High Definition TV displays that came before it. Really, following that naming convention, what we call 4K TVs should be Quad Full High Definition, which as you can see is quite the mouthful.
Before 4K, and even after 4K for a while, nobody mentioned 2K; it just didn't exist in the public sphere. Whether it existed as an obscure industry standard I don't know. It seems to me that 2K started popping up after 8K was already a thing; before that we spoke about 720p HD, 1080p FHD, 1440p QHD, or 4K. But when 8K was added to the discourse, some started referring to anything below 4K as 2K.

It's all a mess really, since there are so many individual resolutions that can be squeezed into this 2K bracket that the term is practically useless; you have to clarify anyway, so why not just go with the actual resolution? That's the only clear thing. I don't even know if my own display is 4K or not at 3840x1600. It's not even standard widescreen 21:9, but 24:10, which even the manufacturer gets wrong on the product page, as it is listed as a 21:9 monitor.
 
That's getting better too, quarter over quarter.
The last quarter is the first time it's gone up in more than a year. It's still far below their peak of 36% at the height of inflation and crypto.

[attached image]
 
If I am reading correctly, the 3070 would not have been part of the $699-and-up sales; it is comparing 4070-and-up to 3080-and-up sales, I think.
The cards such as the GeForce RTX 4070 and RTX 4070 Ti which start at $699 US have seen 40 percent faster revenue ramp compared to their Ampere predecessors.

The sales guy is looking at the price tag, not the SKU name, it seems. The 3080 had really high demand but maybe quite low volume available, so the 4070-4070 Ti-4080-4090 selling significantly more (in Nvidia revenue, not units) than the 3080-3090 is possible.

I didn't read it that way, myself; I read it as them throwing the price in as a basis of comparison. In any case, the correct comparison to make isn't sticker price, it's product class. Sticker price will vary due to a variety of factors, chief among them being inflation (and margin protection). The 3070-class buyer will be a 4070-class buyer, all else considered equal, even with moderate inflation factored in. That said, the "revenue ramp" is likely a function of the ASP, so from that standpoint, it's necessary to include price.

In any case, this article stating that their sales are up 40% doesn't jive with the behaviour I've seen at retail. You don't throw incentives at a product to grease sales when it's flying off the shelf. It just seems bizarre to me.
 
Before 4K, and even after 4K for a while, nobody mentioned 2K; it just didn't exist in the public sphere. Whether it existed as an obscure industry standard I don't know. It seems to me that 2K started popping up after 8K was already a thing; before that we spoke about 720p HD, 1080p FHD, 1440p QHD, or 4K. But when 8K was added to the discourse, some started referring to anything below 4K as 2K.

It's all a mess really, since there are so many individual resolutions that can be squeezed into this 2K bracket that the term is practically useless; you have to clarify anyway, so why not just go with the actual resolution? That's the only clear thing. I don't even know if my own display is 4K or not at 3840x1600. It's not even standard widescreen 21:9, but 24:10, which even the manufacturer gets wrong on the product page, as it is listed as a 21:9 monitor.
Technically yes it is 4K, but it’s known as WQHD+, and 12:5 is weird for sure.
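If anyone's wondering where the 12:5 comes from, it's just 3840:1600 reduced by the greatest common divisor; a quick sketch of my own arithmetic, not anything off the spec sheet:

```python
import math

# Reduce 3840x1600 to its simplest aspect ratio.
width, height = 3840, 1600
g = math.gcd(width, height)              # 320
print(f"{width // g}:{height // g}")     # 12:5, i.e. 24:10, i.e. 2.4:1
# For comparison, the usual "21:9" panels are 2560x1080 (64:27, ~2.37:1)
# or 3440x1440 (43:18, ~2.39:1), so neither is literally 21:9 (~2.33:1) either.
```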
 
The 3070-class buyer will be a 4070-class buyer, all else considered
Probably in part, and why they see growth in the $700+ GPU sales

In any case, this article stating that their sales are up 40% doesn't jive with the behaviour I've seen at retail.
They have so many more SKUs now, in a bigger worldwide gaming community, that it makes it a bit hard to compare; one big difference is the ability to ship all the sales (except for the early 4090 days) versus the Ampere debut. They have a section about upgraders paying a lot more for their cards in 2023 than what they paid for the card being replaced.

I didn't read it that way, myself,
It is unclear (as spin tends to be); the part that made me feel that was what they meant is this one, where it is very clear that the segment would be about price, not card names:
Meanwhile, the most important data point provided by NVIDIA is that while the $699 US+ segment has seen a 3x ramp over Turing GPUs with the newest GeForce RTX 40 "Ada" offerings


I suspect they are using the same definition for the Ampere predecessor, the 700+ segment, and not the 4080 vs 3080 sales.
 
How powerful a GPU do you need for VR ? Is 4060 ti 16gb enough?
How about Intel Arc or AMD RDNA 3 16gb cards 🤔
When it comes to VR it's less about how powerful the GPU is and more about the GPU drivers (at least when comparing Nvidia to AMD). It's been a thing for years (even way back to when [H] did testing) that AMD GPUs that are competitive with Nvidia cards in raster have much worse VR performance. If you buy an NV card you're pretty much guaranteed to have great performance in VR; with AMD you're at the mercy of their driver update team. No idea how Intel fares.
 
I get the feeling these slides are mostly about OEM sales: desktop sales are down, discrete GPU sales are down, laptop sales are up.

That tracks more with what I see but I’m just some asshole what do I know.
 
4K refers to horizontal resolution so by the same metric 1080p would be 2K ~2000 pixels horizontal resolution. And 1440p would then be 2.5K.
I thought the same, but no, that's when talking about the movie world, where it is easy to follow a single simple rule:
  • 2K resolution – digital video formats with a horizontal resolution of around 2,000 pixels
  • 5K resolution – digital video formats with a horizontal resolution of around 5,000 pixels, aimed at non-television computer monitor usage
And so on.

In the video game/PC world it is strange.

Some seem to call 1440p 2K because it has almost twice as many pixels as 1080p(?), and 4K because it has twice as many pixels as 2K, but then they use the word 8K for a resolution that has four times as many pixels as 4K, and if a 6000x4000 monitor were released it would be called 6K. It makes no sense.
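To put the pixel counts side by side with the "K" labels (my own quick arithmetic, just to show why the two don't line up):

```python
# Total pixels for the common labels; the "K" number tracks width, not pixel count.
resolutions = {
    "1080p / FHD": (1920, 1080),
    "1440p / QHD": (2560, 1440),
    "4K UHD":      (3840, 2160),
    "8K UHD":      (7680, 4320),
}
for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels")
# 1440p has ~1.78x the pixels of 1080p (not 2x),
# 4K has 4x the pixels of 1080p, and 8K has 4x the pixels of 4K.
```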
 
I thought the same, but no, that's when talking about the movie world, where:
  • 2K resolution – digital video formats with a horizontal resolution of around 2,000 pixels
  • 5K resolution – digital video formats with a horizontal resolution of around 5,000 pixels, aimed at non-television computer monitor usage
And so on.

In the video game/PC world it is strange.

Some seem to call 1440p 2K because it has almost twice as many pixels as 1080p(?), and 4K because it has twice as many pixels as 2K, but then they use the word 8K for a resolution that has four times as many pixels as 4K, and if a 6000x4000 monitor were released it would be called 6K. It makes no sense.
Hence why just writing out the resolution is best :). I've gotten lazy on it too :p.
 
You'll get 12GB and an essay of cope about how you don't need more than that.
I don't think anyone, including myself, has ever said you don't need more than 12GB, and if they did they're idiots. At 1440p, I game perfectly fine with maxed everything, including textures, with 12GB, but when the day comes to start lowering settings, it'll most likely not be because of VRAM, but if it is, all I'd need to do is lower textures from ULTRAOMG4K to High, and still enjoy a smooth experience. (y)

One last thing: remember, any game that'll tax a 12GB 4070 Ti is also going to tax a 20GB 7900 XT. VRAM won't help much when your average frame rate dips into the 30s and 40s, other than producing better 1% lows, but as I always say: better 1% lows of shit is still shit.
 
Nvidia has us by the balls. Except it doesn't. They make good products and people buy them. That's it. If you don't like it, don't buy it. And then be labeled an economic terrorist. Oh - that's only if Nvidia has a pride flag on their website. Seriously, what's the point of this thread?
 
Oh, not yet. But Nvidia did put out that AMD-level cope page right before the abysmal 4060ti launch for a reason.
I find it quite funny how people react to a page explaining why they went with lower VRAM. It's like Nvidia pissed in people's cornflakes, yet people are quick to completely ignore AMD goading Nvidia over 8GB of VRAM and then releasing an 8GB card themselves.
 