Zarathustra[H]
Eh? There were 4k screens in 2005? Must have been stupid expensive.
That was a typo. Should have said 2015. I'll fix it.
And if you ever want to run MS Flight Simulator, well...
Yeah, but MS Flight Simulator is the new "Crysis". That said, I have zero desire to run MS:FS.
We're probably a good two years away from ray tracing becoming more of a standard. That's how games have historically gone after new consoles are released. The Xbox 360 was released in 2005, and in 2007 alone we got BioShock, Portal, Halo 3, Mass Effect, Crysis, etc. From that point forward we got games that made good use of the hardware. For the Xbox One and PS4 we didn't get games that made good use of the hardware until 2015, again two years later. The Witcher 3, MGSV, Batman: Arkham Knight, Dying Light, Project CARS, etc. all came out in 2015 alone. We still benchmark with The Witcher 3 to this day.
Assuming COVID doesn't screw up the gaming schedule, 2022 will be a hell of a year for gaming. There's a good chance a lot of those games will be using ray tracing. If the AMD RDNA2 cards are already struggling with ray tracing in today's games, then there's a good chance games won't be playable at 60fps on them two years from now. Nvidia, on the other hand, has a good handle on ray tracing, and if your RTX 3070 can't handle it, just turn on DLSS. You're paying $1k for a graphics card for what? To play today's games at or below 200fps? To watch Doom Eternal reach 300fps? Doesn't make sense to me.
The point of Minecraft is that its implementation of path tracing may be how games ideally handle ray tracing in the future. How future-proof are AMD's RDNA2 cards if path tracing is already difficult for them?
This right here is why you don't buy these AMD cards at those prices. If the $500 RTX 3070 is faster in Minecraft ray tracing than a $1k 6900XT, then AMD fucked up. Ray tracing is going to be the future of games, and ignoring ray tracing performance on $500+ graphics cards is just stupid. What's worse is that this $1k graphics card still comes with GDDR6 and not GDDR6X like the RTX 3080 and 3090, so performance at higher resolutions tends to suffer. If AMD wanted a big win, they should have used GDDR6X.
Not when you get RT involved.
But it is not usable yet (ray tracing), so it's moot... like Kyke pointed out, at least 2 generations away.
Within a year, more demanding RT games are all going to run slower on this card than on the (cheaper) 3080.
The target market of this card is really hard to define (same VRAM as the 6800, so no real $300 value there).
For some reason I had much higher expectations of the 6900XT
Great value when compared to the 3090, but when compared to 3080 or 6800XT, maybe not so much =/
I wonder what if anything nVidia will do to the 3090 pricing.
Unlikely. Nvidia will just release the 3080 Ti, which will have 99% of the 3090's performance and 20GB of VRAM, for $999 (probably $1100-1200 for AIBs) to compete directly with the 6900XT. The 3090 will continue to be the “top end” product that only competes with itself at $1500.
But it is not usable yet (ray tracing), so it's moot... like Kyke pointed out, at least 2 generations away.
Arguably not usable for AMD. It is usable on a high-end Ampere, however, when combined with DLSS 2.0.
Was hoping AMD might have killed it with this, and while for the price it’s an amazing piece of hardware, I was hoping they would have taken the performance crown for once. I’m still happy with my RTX 3090 purchase. Yeah, it’s more expensive, but this is [H] and it’s still the fastest.
I'm glad AMD is at least competitive and has a halo product.
For some reason I had much higher expectations of the 6900XT
Great value when compared to the 3090, but when compared to 3080 or 6800XT, maybe not so much =/
I wonder what if anything nVidia will do to the 3090 pricing.
To be fair, it's only an issue because of idiotic resolution increases. I'd be fine still with a 27'' 1080p monitor, personally, and if that were the case a 1080ti would still be a monster for that resolution. However, if you want an updated monitor that has a decent contrast ratio or features like HDR, you aren't getting 1080p anymore.
I really don't get the whole "future proofing" argument, since nothing is "future proof". The situation changes every year, especially now with all the new features that constantly come out.
Honestly a 5-year-old 4c/8t PC has been future-proof this whole time. Maybe you can blame Intel for resting on their laurels, but it's only been in this last month that any new hardware has come out that dramatically shifted needs over to a new generation.
Eh? There were 4k screens in 2005? Must have been stupid expensive.
There were, sort of, and they were very expensive.
I really don't get the whole "future proofing" argument, since nothing is "future proof". The situation changes every year, especially now with all the new features that constantly come out.
Well, it all depends; compared to the late '80s/'90s, computers in the 2010s have had quite a long practical life.
To be fair, it's only an issue because of idiotic resolution increases. I'd be fine still with a 27'' 1080p monitor, personally, and if that were the case a 1080ti would still be a monster for that resolution. However, if you want an updated monitor that has a decent contrast ratio or features like HDR, you aren't getting 1080p anymore.
I really don't get the whole "future proofing" argument, since nothing is "future proof". The situation changes every year, especially now with all the new features that constantly come out.
Maybe that was true at one point, when things were simpler. With the 8800 GTX, yeah, you got some years out of it, but these days it's just not the case.
Makes more sense to me to buy what you need today, and then sell the card in 1 to 2 years and get whatever's new. And with the used market in these times you can typically get a good amount of money back.
Sorry, but I disagree. Moving from 1080p to 1440p was a huge difference for me. You’re close enough to the screen that higher resolutions actually matter. People say the same thing about “the human eye can only detect 60Hz”. I was fine with 60Hz, until I bought a 165Hz monitor. Now I find 60Hz painful to look at, like it literally causes me eye strain. There is an upper limit to where things make sense, but 60Hz/1080p is definitely not that upper limit.
I'd say 1440p is a decent upgrade at 27'', and I'd be fine with that. The issue is that to get better panel technology, all the panels at that size are now 4k minimum.
And again, personally, my perfect panel would be a 27'' 1080p HDR 1000 144Hz+ 10-bit IPS panel. Unfortunately, you've got to jump all the way up to 4k to get the good specs.
Don't confuse my disdain for ultra-high PPI with a disdain for high refresh rates. Each human is different. I get that. I personally can tolerate the PPI of 1080p on a 27'' panel. What I can't tolerate is low refresh rate, shitty contrast ratio, etc.