Roy Taylor is wrong again: Watch_Dogs 2 does not support DX12, is not "optimized for AMD"


TaintedSquirrel

[H]F Junkie
Joined
Aug 5, 2013
Messages
12,739



http://gamegpu.com/action-/-fps-/-tps/watch-dogs-2-test-gpu

[GameGPU Watch Dogs 2 benchmark charts]


http://www.pcgameshardware.de/Watch-Dogs-2-Spiel-55550/Specials/Test-Review-Benchmark-1214553/

[PCGamesHardware Watch Dogs 2 benchmark chart]
 
Exactly. I am not shocked this is bullshit. Wasn't that post out before Pascal was even released?
 
I don't get it. AMD touted a partnership with Watch Dogs 2 8-9 months ago according to that tweet, and then went quiet for 8 months while Ubisoft and Nvidia released a trailer and touted their partnership. If you were making the point that AMD can't compete with Nvidia on game development partnerships, you did a good job. But bringing up a tweet from 8 months ago simply to call someone wrong, when it's obvious Ubisoft decided to switch focus to Nvidia, is pretty... petty? I don't know how to express this with the proper words. I mean, really? You bring up an 8-month-old tweet to call him "wrong" for what reason? Do you have more recent tweets? Did he tout AMD-Watch Dogs 2 cooperation when Nvidia released their trailer?

I don't get this. Did you want him to say, 4-6 months after that tweet, "AMD is no longer alongside Watch Dogs 2 development"? I guess that's your point, to demand AMD cut off its own foot. But why call him wrong when the evidence seems pretty clear that Ubisoft switched to Nvidia? Did AMD tout it after the switch? Is that what you're calling out?

I mean, you have plenty to slam the guy on, but this Watch Dogs 2 DX12 stuff is not one of them.


And those benchmarks aren't using the latest AMD driver.
 
AMD is notorious for their misleading marketing.
Just because it's been nearly a year doesn't mean they get a pass on this. That's just how I feel, though.
 
You could actually consider this game relatively neutral in terms of the hierarchy between AMD and Nvidia in DX11. If you consider Battlefield 1 "optimized" for AMD, AMD's cards actually perform slightly (albeit very slightly) better in this title, at least looking at the GameGPU numbers.
[GameGPU 1080p DX11 chart]


Whether this means the title is well optimized for either IHV might depend on your criteria, as well as other influences.
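
To make "relatively neutral in hierarchy" concrete: the comparison amounts to normalizing each card's average FPS to a common baseline card and comparing that ratio across titles. A minimal sketch below; every card name and number in it is a hypothetical placeholder, not a GameGPU result.

```python
# Minimal sketch: compare vendor "hierarchy" across titles by normalizing
# each card's average FPS to a baseline card. All numbers are placeholders.
def hierarchy(results, baseline):
    base = results[baseline]
    return {card: round(fps / base, 2) for card, fps in results.items()}

# Hypothetical averages for two titles at the same resolution and settings.
wd2 = hierarchy({"RX 480": 52.0, "GTX 1060": 54.0}, baseline="GTX 1060")
bf1 = hierarchy({"RX 480": 62.0, "GTX 1060": 64.0}, baseline="GTX 1060")
print(wd2, bf1)  # similar RX 480 : GTX 1060 ratios => "relatively neutral"
```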

As for DX12, well: http://www.hardocp.com/article/2016/10/24/battlefield_1_video_card_dx12_performance_preview/6

Should you play BF1 in DX11 or DX12?


One of the questions we wanted to ask, and answer, is whether you should play BF1 in DX11 or DX12 for the best gameplay experience.

We have discovered that BF1 is highly optimized and runs well in DX11. Performance is very good on an AMD Radeon RX 480 and NVIDIA GeForce GTX 1060 in DX11. We can run the campaign mode at 1440p with the highest possible in-game settings on both video cards and still average above 60 FPS. At 1080p performance is well into the 90s to 100s, which is incredible. However, you must use DX11 to experience this bliss in gaming performance.

When switched to DX12, BF1 falls flat on its face. There is no performance improvement, and there are no visual quality improvements. In fact, there is a negative return on framerate. On both the AMD Radeon RX 480 and NVIDIA GeForce GTX 1060 the framerates go south and provide worse performance while gaming, at 1080p and 1440p.

Given that resources are always going to be limited, those might be better spent in areas other than marketing hype/checkboxes, in terms of actual user-experience benefits.

People like to hate on GameWorks, but at the moment those additional GameWorks effects are actually more beneficial to users than a lot of these marketing-driven DX12 implementations.
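
(For what it's worth, the "negative return on framerate" in the quoted [H] piece is just a percent change between the two API paths. A minimal sketch of that arithmetic, with made-up numbers rather than [H]'s actual figures:)

```python
# Minimal sketch: frame-time budget and percent FPS change between APIs.
# Inputs are hypothetical, not figures from the [H] review.
def frame_time_ms(fps):
    return 1000.0 / fps

def api_delta_pct(dx11_fps, dx12_fps):
    return (dx12_fps - dx11_fps) / dx11_fps * 100.0

print(f"60 FPS budget: {frame_time_ms(60):.1f} ms/frame")   # 16.7 ms
print(f"95 -> 86 FPS: {api_delta_pct(95.0, 86.0):+.1f}%")   # about -9.5%
```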
 
Well, the performance looks pretty close to the same to me. The CPU numbers are awesome also. I like the fact that the 5960X is putting in work!

[GameGPU CPU benchmark chart]
 
I was impressed CF and SLI are actually working.

Can't blame Tainted for holding a company to their word.

Let's not even bring up that the RX 480 box says it is VR ready... when it sucks arse.
 
Roy, the notorious liar, being wrong? Noooo... Which of the AMD VPs hasn't blatantly lied yet?

Anyway, look at the PCGH benchmarks with all the fun turned on. Pre-GCN 1.3 ages horribly: Polaris beating Fury X, etc. The primitive discard accelerator is bad news for older GCN cards.
 
Well, if you compare it to other enthusiast chips, the picture probably won't be so pretty. The second-best CPU on that list is a 6700, which is not an enthusiast chip, not even the high-end mainstream one.

Yep, looks to be more cache-oriented.

Is the benchmark scripted or done via actual gameplay?
 
I don't get it. AMD touted a partnership with Watch Dogs 2 8-9 months ago according to that tweet, and then went quiet for 8 months while Ubisoft and Nvidia released a trailer and touted their partnership. [...]

And those benchmarks aren't using the latest AMD driver.

They both do this, though.
Look at Nvidia presenting how great Vulkan was in Doom when they launched Pascal, yet Doom involved far more work with AMD (in the Vulkan context), as the developers had the time and engagement with AMD to use the low-level GPUOpen extensions that greatly enhance performance (the same way Nvidia does with OpenGL, but not yet fully with Vulkan).

Regarding drivers, PCGamesHardware seems to be using the latest; 16.11.5 came out on 28/11.
I tend to ignore GameGPU.

Cheers
 

Quick update from them. Nice.
 
AMD is notorious for their misleading marketing.
Just because it's been nearly a year doesn't mean they get a pass on this. That's just how I feel, though.

So is Nvidia. G-Sync is the most overhyped POS user-tax product I can think of; it rarely works great, but they never tested using it. Surprised they haven't had an article written on how bad it is.

Both do this nonsense.
I bought a 1070 laptop believing it would be just like the desktop card, as per all their marketing... Well, it sucks really, really bad.
 
I bet you bought an HP laptop. G-Sync is known to be broken on it. (Since you don't bother to address comments made to you in other threads, I'm dropping this here.)
 

Quick update from them. Nice.
Yeah,
strange how AMD took a while to get the drivers out to reviewers (PCGamesHardware mentioned repeated requests for them in their article), but on the plus side the improvement to their driver-side service has been great.
I notice, though, that this has been since they hired one of the senior Nvidia driver team managers; they must have paid him a small fortune to jump.
Cheers
 
I'm glad you started this thread. Just the other day I came out of my house and saw AMD kick my dog really hard and then run away. The other night I was pretty sure it was AMD that was hitting on my wife at the restaurant as well. I've also heard rumors that AMD is aligned with ISIS, but nothing has been substantiated yet. I'm glad you can hold such a petty grudge for so long, because we cannot let these atrocities continue. I've never heard of marketing embellishing their products before, so we MUST nip this in the bud... thank goodness for people like you. :meh:
 

Seems like you haven't really been paying attention to the shenanigans AMD marketing has been pulling lately.
 

This! It would be one thing if Roy's comments were a one-off, but they aren't. Both AMD and Roy are constantly touting AMD as the fastest GPUs for the money when they plainly aren't. VR has shown this, along with the BetterRed marketing vs. actual performance.
 
You mean companies hype their product to stupid levels instead of saying "meh, go buy an Nvidia card"? Colour me shocked...

Well, in terms of VR, what AMD is doing is pretty much false advertising. There is no "premium" VR experience to be had with an RX 480.
 
Hey, Samsung phones are obviously being advertised as a premium VR experience. Guess it is all about perspective. Personally, I am not going to bother, since the only games I would play with VR would be flight simulations, and that would definitely cause motion sickness no matter the FPS. :yuck:
 
Yes, AMD premium VR experience will be unforgettable - you will be heaving chunks of reality to disperse with the premium VR experience making an everlasting memory in time. :vomit:
 
Well, the performance looks pretty close to the same to me. The CPU numbers are awesome also. I like the fact that the 5960X is putting in work!
Looks like my two 1070s will work well with this game; it helps that the game came free with the 2nd 1070, too. That 9590 just keeps on giving and climbing that game ladder over time ;).
 
Also remember how often AMD and Roy went on about the DX12 performance advantage in AotS compared to Nvidia. Now that Nvidia is actually either equal or better (the 980 Ti outperforms the Fury X now; the 1060 and 480 are pretty much equal, with just a slight lead for either depending upon resolution and whether it's the benchmark or in-game), they have moved on to other aspects. That is ironic, as this is still the only true full-performance DX12/async 'game benchmark' available (differentiating it from DX12 Time Spy, which is only a benchmark utility, and from DX12 games that still do not match its complete development).

Examples that were well monitored/benchmarked:
[AotS benchmark chart]



And amusingly, notice that the Fury X is actually ahead of the reference 980 Ti in DX11 but not in DX12 with async compute. :)
OK, in theory the Fury X may still be weaker due to the minimum frames.

[AotS benchmark chart]


Reference 1060 and 480.
[AotS benchmark chart]


Worth noting as well that Nvidia's performance improves somewhat in-game compared to the benchmark, albeit not dramatically, so these figures would subtly shift again in Nvidia's favor, as they are based upon the internal benchmark (albeit captured with PresentMon).
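
(Aside on PresentMon, since it is what the figures above lean on: it logs per-frame present intervals to a CSV, and averages/lows are computed from those frame times rather than read from the game's internal counter. A minimal sketch of that post-processing, assuming a PresentMon-style CSV with a MsBetweenPresents column; the file name is a placeholder.)

```python
# Minimal sketch: average FPS and 1% low from a PresentMon-style CSV.
# Assumes a "MsBetweenPresents" column as in PresentMon capture logs;
# "capture.csv" is a placeholder file name.
import csv

def summarize(path):
    with open(path, newline="") as f:
        times = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 / (sum(times) / len(times))
    # 1% low: average FPS over the slowest 1% of frames.
    worst = sorted(times, reverse=True)[:max(1, len(times) // 100)]
    low_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_fps

avg, low = summarize("capture.csv")
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```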

So once AoTS no longer suited their narrative that Nvidia is weak across the model range when using DX12/async compute compared to AMD models, they managed to sweep it under the carpet. If they had not made such a noise about AoTS performance and how weak Nvidia was, I would not have minded so much myself.
And yes, before anyone points out that the 980 is still weak compared to, say, the 390X in AoTS: the context was that AMD used the game in general to highlight a weakness of Nvidia's architecture and solution, when in reality Nvidia's architecture/solution had nothing to do with why AMD was so outperforming Nvidia in AoTS back then.

Cheers
 
This is why I never take you seriously; a wolf in sheep's clothing, if you will.

[attached benchmark screenshot]

I am guessing you are using a 1070 review for those, since they have a number of them in the list. This is from a 1070 review, July 6, 2016. This is also why I am not a big pusher of benches for anything more than "ballpark" answers. Seriously, you are trying too hard to bash AMD. I can't say I have ever seen you show this kind of irreverence for Nvidia or Intel, though in your defense I haven't seen you post much on CPUs.
 