> Just a couple of years ago some said "4K is pointless, the pixels are so tiny" etc... Well, here we are.

More like 5 years ago, and yes, 8K is pointless for the next 5+ years.
> Or you could, you know, just display at full res without resorting to all the bs... just sayin

DLSS actually looks better than native 4K in Death Stranding while also having better performance.
> More like 5 years ago, and yes, 8K is pointless for the next 5+ years.

Flat 8K panels I don't think will ever be worth it for the average PC gamer on a desktop monitor; couch/TV gamers, maybe. I can, however, easily see 8K or even higher-resolution VR being HUGE.
Yep. And while AMD is guaranteed to be enjoying decent margins on these, the same can't be said for Nvidia, or for Nvidia and Samsung together. There's little doubt that their margins are slim to none right now. I've heard that Samsung is essentially eating all their costs. They can afford to, no problem; they're huge, and it's better to have a good partner, especially one that is presumably working *very* hard to improve their manufacturing. But ultimately: no need to lose any more money here.
> I wouldn't say significantly, outside the 3090, but they don't have anything up there to counter the CUDA environment, and despite their 8K gaming bullshit, the 3090 is a workstation card. But serious shots fired. I look forward to seeing NV's response, and not their press release: I want to see their boots-on-the-ground war faces.

When I say significantly (price): a 3090 at $1,500 MSRP vs. a 6900 XT at $1,000.
Meh, who is counting? I'd rather not think about how much I have aged. Anyway, my original point was that since pushing rendering resolution up with brute-force performance is getting harder, stuff like DLSS (and whatever AMD's equivalent is going to be) is the future, I firmly believe. Hopefully there will be hardware-vendor-agnostic open solutions eventually, and not just DLSS with Nvidia's stranglehold.
> I didn't watch but I'm guessing there's no date for mid-range cards yet?

Nope, but rumors have said early 2021.
I didn't watch but I'm guessing there's no date for mid-range cards yet?
> When I say significantly (price): a 3090 for $1,500 MSRP vs. a 6900 XT for $1,000.

That's why I said outside the 3090: the pricing for the rest of the stack is pretty close. I'm not going to count scalper prices for anything, because until AMD's cards hit the shelves there is nothing saying their cards will fare any better.
$500 - that's pretty significant, I'd say. Plus, try finding a card for MSRP. eBay is the only place to get a 3090, and they're going for $2,300+.
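For what it's worth, the gap in both absolute and relative terms, using the MSRPs quoted above (street and scalper prices obviously differ):

```python
# MSRPs as quoted in the thread; street/scalper prices vary wildly.
msrp_3090 = 1500
msrp_6900xt = 1000

delta = msrp_3090 - msrp_6900xt        # absolute gap in dollars
premium = delta / msrp_6900xt * 100    # 3090's premium over the 6900 XT

print(f"${delta} gap, a {premium:.0f}% premium")  # $500 gap, a 50% premium
```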
This BS will end in a month.
> I didn't watch but I'm guessing there's no date for mid-range cards yet?

I've been told they will be announced at CES along with the EPYCs and Threadrippers.
> Anyway, my original point was that since pushing rendering resolution up with brute-force performance is getting harder, stuff like DLSS (and whatever AMD's equivalent is going to be) is the future, I firmly believe. Hopefully there will be hardware-vendor-agnostic open solutions eventually, and not just DLSS with Nvidia's stranglehold.

I wouldn't call it a stranglehold. Nvidia has poured large resources into CUDA and is reaping many benefits. It's good, and there isn't anything quite like it out there that does what it does better or easier.
> Not sure if this is the place to discuss this, but I'm getting a lot of feedback about AMD's drivers? Are they that bad?

AMD drivers aren't bad. This is an old trope. They haven't been bad since the '90s.
Yeah, I use NVENC a lot (ShadowPlay), so I'm curious if they improved their encoder as well.
> You are correct -- not a peep on any supposed "6700 and under" cards yet (wasn't mentioned in today's reveal).

Darn, that's far out. Oh well, guess I'll see if I spend a bit more on a 6800.
> AMD drivers aren't bad. This is an old trope. They haven't been bad since the '90s.

Eh... in my experience they haven't been bad, but they have been "less polished" than Nvidia's, even in the 2000s and early 2010s. I've had an ATI 9800 and HD 5870, and an AMD HD 7970 and an R9 390. Definitely still had more bugs with them than with my Nvidia cards, but nothing unmanageable. I haven't had an AMD card since 2016, so I don't know their quality over the past 4 years.
Not at all, not even close. (video by forum member chameleoneel)
Way too many people buying into the "DLSS is magic" BS.
> Definitely. It was kinda weird that AMD didn't talk about existing raytracing titles like Control or Cyberpunk. It would be terrible if they don't run on the 6 series.

No, they did say that all their cards support raytracing in all existing and upcoming games, and those numbers they showed were supposed to be at max settings.
AMD didn’t talk raytracing performance at all.
> AMD is the new Apple?

Hell no, nothing like Apple. Come again?
What a load of crap. I'm pretty sure my eyeballs work just fine, and if I run Death Stranding at native 1440p it looks like s*** compared to running quality DLSS. Native resolution has way more aliasing and crawling than DLSS quality mode.
> In for an RX 6800, especially if it can be flashed with the 6800 XT BIOS like we've been doing for ages now.

Don't they have a different number of CUs?
> What a load of crap. I'm pretty sure my eyeballs work just fine, and if I run Death Stranding at native 1440p it looks like s*** compared to running quality DLSS. Native resolution has way more aliasing and crawling than DLSS quality mode.

The anti-aliasing from DLSS in Death Stranding is not in question. I mention in the video that it's really good, with basically no aliasing shimmer even during high motion.
> I find it curious that raytracing performance was not talked about at all. Must not be a strong point of this generation. Which is fine, as wide-scale adoption isn't there yet. Even if AMD is 80% as good as Nvidia's RTX, that's a good start.

Each architecture does raytracing differently. You are going to need games to be optimized for it; it's not a matter of just flicking a switch and getting good performance.
> I can attest personally, DLSS does not magically increase image quality in any title to date. It does give you extra fps. Glad you can't tell the difference, but I have noticed it in every title.

Let me talk facts instead of opinions. It is 100% fact that DLSS 2.0 in quality mode looks better overall than native resolution while running better.
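Claims like this are hard to settle by eye. One crude way to put a number on "closer to native" is PSNR against a reference frame; a minimal sketch (illustrative only, and note PSNR is known to correlate poorly with perceived quality, which is partly why these DLSS-vs-native arguments never settle):

```python
import math

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio between two flat pixel sequences (dB)."""
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(peak ** 2 / mse)

# Toy 4-pixel "frames": small uniform error of 5 levels per pixel.
print(psnr([255, 0, 255, 0], [250, 5, 250, 5]))  # ~34.15 dB
```

Higher is closer to the reference; lossless matches come out as infinity.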
The idea that you somehow get something for nothing is a marketing position, not reality. It's an upscaler, and it has ALL the issues of upscaling, just to a reduced degree thanks to the algorithm.
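For anyone wondering what "an upscaler" means at the naive end of the scale, here is a pure-Python nearest-neighbour upscale. This is only the blunt baseline whose blockiness every upscaler tries to hide; DLSS replaces the plain pixel copy with temporal accumulation plus a learned reconstruction pass, but it still starts from fewer rendered pixels:

```python
# Nearest-neighbour upscale: every output pixel copies the closest source
# pixel, so edges turn into blocks -- the baseline upscaling failure mode.
def upscale_nn(img, factor):
    return [
        [row[x // factor] for x in range(len(img[0]) * factor)]
        for row in img
        for _ in range(factor)  # repeat each source row `factor` times
    ]

src = [[0, 9],
       [9, 0]]
print(upscale_nn(src, 2))
# [[0, 0, 9, 9], [0, 0, 9, 9], [9, 9, 0, 0], [9, 9, 0, 0]]
```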
In my experience, the video is 100% accurate with regard to Death Stranding.
> Not sure if this is the place to discuss this, but I'm getting a lot of feedback about AMD's drivers? Are they that bad?

They are not. Most of AMD's supposed driver issues have been caused by other things, the faulty RAM chips being one. They also use a simpler power-delivery system, so PSUs operating outside of spec can cause a real headache for them where an Nvidia card will still operate within reason. There have been a few notable driver problems, but most of them were Windows-related, date from years back, and were addressed in a reasonably timely fashion.
> I didn't watch but I'm guessing there's no date for mid-range cards yet?

RX 6800 > 3070, and it comes out on 11/18/2020.
> Zero mention of any improvements to AMD's encoders, and zero showing of any DXR-equipped game already out this year or last, speaks volumes.

The RTX titles used DXR 1.0; the DXR 1.1 spec is the one rolled into DX12U, and it maintains full backward compatibility. Nvidia didn't deviate from the spec, so unless the developers did some funky implementations, the AMD cards should work out of the gate or with a minor patch.
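The backward-compatibility point boils down to: DXR tiers are ordered, and a device reporting a higher tier runs workloads written against a lower one. A toy Python stand-in for the D3D12 feature check (the tier names and values mirror the real `D3D12_RAYTRACING_TIER_*` constants, but this is only an illustration, not actual D3D12 code):

```python
from enum import IntEnum

class RaytracingTier(IntEnum):
    # Values mirror D3D12_RAYTRACING_TIER_* for illustration.
    NOT_SUPPORTED = 0
    TIER_1_0 = 10   # original DXR spec the first RTX titles targeted
    TIER_1_1 = 11   # DXR 1.1, part of DX12 Ultimate; superset of 1.0

def can_run(game_requires: RaytracingTier, device_reports: RaytracingTier) -> bool:
    # Higher tiers are supersets of lower ones, so a DXR 1.1 GPU
    # (RDNA2 or Ampere) can run an existing DXR 1.0 title.
    return device_reports >= game_requires

print(can_run(RaytracingTier.TIER_1_0, RaytracingTier.TIER_1_1))  # True
```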
> Each architecture does raytracing differently. You are going to need games to be optimized for it; it's not a matter of just flicking a switch and getting good performance.

Wrong. They already said that it's all based off DX12U DXR, even Nvidia's RTX, and they said that the cards will do raytracing in all existing and future titles.