Rise of the Tomb Raider DX11 vs. DX12 Review @ [H]

Good, interesting article.

I don't think DX12 is really going to be all that big a deal; mostly it will just be a wash. I don't think an API that slows things down unless the hardware manufacturer optimizes for it is really a move forward at all. I mean, why do standards (which is what an API is) exist at all if clearly no one follows them? (Hence needing optimization all the time.)

DX12 allows low-level access, so it's really up to the developer to get in and get their hands dirty making things go. Relying on the hardware guys to optimize everything for every title is just going to lead to more shortcuts than optimizations. I think DX12 is an API where every game released will have to be judged on its own merits, because some DX12 titles are going to be lean and mean with the potential to be blazing fast, while if developers just go in and change some file headers to run their high-level DX11 calls under DX12, they will take a performance hit, as we see here. (That is the real issue, not optimization, IMO: this developer just turned a bunch of high-level DX11 code into DX12 calls.)
 
Would it be worth focusing more on minimum framerates in the future? As well as including average CPU utilization alongside the framerates? Streaming to a TV you'd want a minimum of 60, for VR a minimum of 90 (except for async timewarp, maybe), and if you're over 100 average you really don't care. At least including the CPU figure would give people an idea of how screwed they are on various platforms. A slow CPU paired with a fast GPU might be a better buy than a whole new system. Games with relatively low DX12 CPU usage should be more playable on older systems.
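Just to be concrete about the CPU figure, here is a rough sketch of what I mean (my own toy Python using psutil, not anything [H] uses; the duration and sampling interval are arbitrary):

import time
import psutil

def sample_cpu_utilization(duration_s=120, interval_s=1.0):
    """Sample total CPU utilization once per interval for the length of a run."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        # cpu_percent blocks for interval_s and returns utilization across all cores
        samples.append(psutil.cpu_percent(interval=interval_s))
    return sum(samples) / len(samples), max(samples)

avg, peak = sample_cpu_utilization()
print(f"average CPU: {avg:.1f}%  peak CPU: {peak:.1f}%")

Run something like that alongside a benchmark pass and you'd at least see whether the DX12 path is lighter on the CPU.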
Looking at that right now.
 
I wonder what the difference would be on the 8-core AMD CPUs? They would probably have much better minimum frame rates even if the average is lower...
 
Not necessarily; a GPU years from now could probably pull it off. I think benchmarking for certain FPS thresholds might be useful. Right now [H] does "best experience", but that's subject to whatever display you're using. 50 FPS in VR is probably not the "best experience". IMHO a stable and sufficient minimum FPS would be the best experience.
Duh!
 
I'm just saying they should be designing games that will also play well in the future. It won't be a ton of cash, but look at all the bundles sold on Steam that are still bringing in revenue for companies. Just because a game isn't suitable for VR now doesn't mean it won't work really well in the future. TR could be a fun game to replay in VR in the future.
 
I'm just saying they should be designing games that will also play well in the future. It won't be a ton of cash, but look at all the bundles sold on Steam that are still bringing in revenue for companies. Just because a game isn't suitable for VR now doesn't mean it won't work really well in the future. TR could be a fun game to replay in VR in the future.
They aren't going to play worse.
Why are you making obvious and pointless statements?
 
The Steam overlay FPS counter still works with DX12 in this game, except I have not found a way to record the data stream like you can with Fraps. Just finished this game and it was awesome! I was at 82% but went back and mopped up. I was surprised that the characters after the ending are in context with their discussions about rebuilding, what happened, etc. Wow, what amazing attention to detail in this game, even after you finish it.
 
No desire to reinstall this game. I'm still bummed they made Lara have a fat ugly face compared to the last Tomb Raider.

While that's not the most flattering screenshot of the new model, I too liked the original better. That said, I don't hate the new one; I think they changed it for the Definitive (console) release of TR 2013. I'm guessing they made her look more like the voice actress so they could motion capture her with less adjustment.

ON TOPIC: I'm almost inclined to believe that AMD has heavily optimized the canned benchmark in DX11 mode but not yet for DX12. For real-world gameplay the DX11 path is weaker but mature, so the benefits of DX12 must be canceling out the lack of driver optimization (or one could hope that less of this game-specific driver optimization is needed, assuming skilled game devs).
 
They aren't going to play worse.
Why are you making obvious and pointless statements?
For playing current games not designed for VR
Current games, at least some of them, should be designed with future VR in mind. While it might not work now, even with lower settings, it probably will in the future.
 
Current games, at least some of them, should be designed with future VR in mind. While it might not work now, even with lower settings, it probably will in the future.
Honestly, you baffle me.
How can they make games utilise VR effectively before any VR units are released?
Companies that are not involved in the development cycle have no chance.
I would rather they didn't waste money on assimilating ghosts and make a good game with the tech they know.
 
No desire to reinstall this game. I'm still bummed they made Lara have a fat ugly face compared to the last Tomb Raider.

[attached image]


Naw man, she just got hooked on the ganja and put on weight from the munchies...
 
Just ran about 60 sets of numbers using the benchmark for the lowest FPS figures... the sample size is so small it is horrible. The data can easily have a delta of 30%.
 
I think I've seen sites do the 99th percentile for frame times. You almost want something like that, IMO. Or the average of the lowest 1%. Something where one errant reading can't skew the data.
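Something along these lines is all I'm picturing (my own throwaway Python, assuming you can get per-frame times in milliseconds out of whatever capture tool ends up supporting DX12):

def one_percent_low(frame_times_ms):
    """99th percentile frame time plus average FPS of the slowest 1% of frames."""
    ordered = sorted(frame_times_ms)                  # fastest to slowest
    p99 = ordered[int(0.99 * (len(ordered) - 1))]     # 99th percentile frame time (ms)
    worst = ordered[-max(1, len(ordered) // 100):]    # slowest 1% of frames
    low_fps = 1000.0 / (sum(worst) / len(worst))      # average FPS over that slice
    return p99, low_fps

p99_ms, low_fps = one_percent_low([16.7, 16.9, 17.1, 40.0, 16.6, 16.8])
print(f"99th percentile: {p99_ms:.1f} ms, 1% low: {low_fps:.1f} FPS")

A single spike gets averaged into the slowest 1% instead of defining the minimum outright.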

I am sure some of us would love a capability analysis, but that's a bit overboard. Even standard deviation might be too much. Has to be simple?
Not exactly wanting to spend days on this. I will crunch all the data in the morning and see if there is anything meaningful in it.
 
Honestly, you baffle me.
How can they make games utilise VR effectively before any VR units are released?
Companies that are not involved in the development cycle have no chance.
I would rather they didn't waste money on assimilating ghosts and make a good game with the tech they know.
Simple: design games with future patches or expansions for VR in mind. Something they can do without having to re-write the entire engine from scratch. Have control systems and visuals that would be compatible, even if current performance isn't sufficient to pull it off.
 
I wonder what the difference would be on the 8-core AMD CPUs? They would probably have much better minimum frame rates even if the average is lower...


On my FX 8350 setup paired with a 280X, I too noticed much worse performance in DX12. It was so poor I couldn't tolerate it for longer than 5 minutes.
 
Please elaborate on what exactly you want to see and how we can accomplish that if there are no framerate capture tools that support DX12?
I'm sure they will come out soon enough. It's fine for now, but I see no need to reinvent the wheel, especially considering that with the performance regression in this game running DX12 it's a complete waste of time, except to show that it's broken.

You aren't seriously going to use in-game benchmark tools now?
 
I'm sure they will come out soon enough. It's fine for now, but I see no need to reinvent the wheel, especially considering that with the performance regression in this game running DX12 it's a complete waste of time, except to show that it's broken.

You aren't seriously going to use in-game benchmark tools now?

Our highest playable settings will always come from manually playing the game in a real-world environment.

What is in question is the need to show framerate. Reference the discussion here - Measuring the Performance of Each Game will Necessitate...

This evaluation did not show framerate for the highest playable settings. Were you able to still figure out how the cards compared?
 
I thought the evaluation was fine. Seeing as it is for DX12, it would be cool to see how it stacks up in a dual-core, AMD, or budget scenario. I don't think that's really the point of [H], though.
 
So the new DirectX version didn't immediately bring the magic performance improvements that devs and architects were touting over the past year, like they have for, well... every new version of DX ever? Color my mind blown.

DX12 may offer a number of methods for optimizations, but it's going to take a long while before devs start using them. Most devs haven't even fully adopted the DX11 feature set...
 
Our highest playable settings will always come from manually playing the game in a real-world environment.

What is in question is the need to show framerate. Reference the discussion here - Measuring the Performance of Each Game will Necessitate...

This evaluation did not show framerate for the highest playable settings. Were you able to still figure out how the cards compared?
Why are you asking such leading questions? :\

It would depend on whether I want to believe your subjective analysis without empirical data to back it up. Can you imagine if all reviews were done this way? How would you analyze the differences in the reviewers' opinions? One guy says it was a stuttering mess. The other guy says it was smooth as butter. They can't both be right. Or could they? How would you know? Frame rate over time and frame time variance are two measurements that could explain it. If the two reviews' charts are the same and, say, they both show abnormalities, then you can assume one guy just isn't qualified to make a subjective analysis.

You nailed Crossfire dropping frames back in the day before FCAT or any other way of actually quantifying it. Even you, though, couldn't tell why 50 FPS in Crossfire didn't feel like 50 FPS but much less. I understand that right now you can't do the measurements that you normally do. When you can, though, I still want to see the frames/time graphs. :)

[attached frame rate over time graph]

Here, for example, you can clearly see where the nVidia cards are getting crushed in the final third of the test. It makes it crystal clear where the issue is.
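For what it's worth, both measurements are cheap to compute once a capture tool gives you per-frame times under DX12. A rough sketch (my own Python, with made-up input numbers just to show the shape of the output):

import statistics

def frame_metrics(frame_times_ms, bucket_s=1.0):
    """Average FPS per one-second bucket plus the frame time standard deviation."""
    fps_over_time, elapsed_ms, frames = [], 0.0, 0
    for ft in frame_times_ms:
        elapsed_ms += ft
        frames += 1
        if elapsed_ms >= bucket_s * 1000.0:
            fps_over_time.append(frames / (elapsed_ms / 1000.0))
            elapsed_ms, frames = 0.0, 0
    return fps_over_time, statistics.pstdev(frame_times_ms)

# a smooth stretch, a stutter, then smooth again (made-up numbers)
fps_curve, jitter = frame_metrics([16.7] * 120 + [50.0] * 10 + [16.7] * 120)
print(fps_curve, f"frame time std dev: {jitter:.2f} ms")

Plot the first list over time and a dip like the one in that graph jumps right out; the standard deviation puts a single number on the "smooth vs. stuttering" argument.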
 
Seems they updated Tomb Raider again. Now Version 1.0.647.2. Now with the same settings for DX11 & DX12, I am seeing about 1-3 FPS difference using the built-in benchmark. A lot closer than it was on the previous version. Of course, YMMV.
 
On my FX 8350 setup paired with a 280X, I too noticed much worse performance in DX12. It was so poor I couldn't tolerate it for longer than 5 minutes.

I am running an 8370 here to test with, running from a locked 4.3GHz down to 2.8GHz, and I am not seeing any DX12 majik handwriting on the wall.

You aren't seriously going to use in-game benchmark tools now?

That is certainly not what we want to do. The ROTR benchmark, for all intents and purposes, is SHIT... but we know that some of our readers require some sort of FPS data, so we included it this time.

Seems they updated Tomb Raider again. Now Version 1.0.647.2. Now with the same settings for DX11 & DX12, I am seeing about 1-3 FPS difference using the built-in benchmark. A lot closer than it was on the previous version. Of course, YMMV.

Interesting; of course, I just ran 60 sets of benchmarks last night. As for average FPS, these were usually within 2 to 4 FPS. I will give it another quick look this morning to see if anything has changed at high and low GHz values.
 
On my FX 8350 setup paired with a 280X, I too noticed much worse performance in DX12. It was so poor I couldn't tolerate it for longer than 5 minutes.
The 280X is a GCN 1.0 card, meaning it only supports DX feature level 11_1, so you're not going to see the supposed full benefits of DX12.
 
You guys can't really expect to call this a DX12 benchmark if you don't vary the CPU used. That's been the resounding call to arms as far as DX12 has gone.

This generation there are no amazing new shaders, only improved efficiency layers. This is why they have back-ported DX12 to more than just Maxwell and Fiji.

If you're not going to check into this efficiency improvement, then you're wasting your time.
 
WRT frame rates/etc: I don't know how easy it is for you guys to wrangle the FPS data sets you get, but aggregating them all (per test condition) and displaying an FPS histogram wouldn't be bad. Depending on the raw data, a few choice scripts would be able to build those plots (and using your entire testing set gives you a much better look at how a game performs in general).
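Something like this is the sort of script I mean (the file layout, card name, and one-sample-per-line CSV format here are all made up for illustration):

import csv
import glob
import matplotlib.pyplot as plt

def load_fps(pattern):
    """Pool the FPS samples from every run log matching the pattern."""
    samples = []
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            samples.extend(float(row[0]) for row in csv.reader(f) if row)
    return samples

fps = load_fps("runs/fury_x_dx12_1440p_*.csv")   # hypothetical per-run logs
plt.hist(fps, bins=40)
plt.xlabel("FPS")
plt.ylabel("Frame count")
plt.title("Fury X, DX12, 2560x1440 - all runs pooled")
plt.savefig("fury_x_dx12_1440p_hist.png")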

Wasn't/isn't Tomb Raider one of the titles that tends to run better on GCN vs. Maxwell (speaking prior to DX12)?

Agree with the bottom line of the article, although I'd probably hold off on any further testing until there's parity (or a very small margin) between the DX11 and DX12 implementations. It just doesn't seem like there's useful info to be found (yet).
 
You guys can't really expect to call this a DX12 benchmark if you don't vary the CPU used. That's been the resounding call to arms as far as DX12 has gone.
This generation there are no amazing new shaders, only improved efficiency layers. This is why they have back-ported DX12 to more than just Maxwell and Fiji.
If you're not going to check into this efficiency improvement, then you're wasting your time.

I'm curious as to how you suggest they do that? How would you test "efficiency improvement"?
 
I'm curious as to how you suggest they do that? How would you test "efficiency improvement"?

Same way other sites do?

Test DX11 versus DX12 on an overclocked Core i7, like they normally do.

Test DX11 versus DX12 on a Core i7 underclocked to 3.0 GHz, and possibly also 2.0 GHz.

Disable two cores and repeat the tests as a "Core i3" equivalent.

You should see increasingly large gaps in favor of DX12 performance as you make the game more CPU-dependent. And you can just run a benchmark on rails.
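To be explicit about what I'd expect to come out of that, here is a toy tabulation (the FPS numbers are placeholders I made up purely to show the shape of the comparison, not measurements from any review):

# config: (DX11 avg FPS, DX12 avg FPS) -- placeholder values, not real data
results = {
    "i7 overclocked":     (95.0, 96.0),
    "i7 @ 3.0 GHz":       (80.0, 87.0),
    "i7 @ 2.0 GHz":       (62.0, 74.0),
    "2 cores ('i3-ish')": (55.0, 68.0),
}

for config, (dx11, dx12) in results.items():
    gain = (dx12 - dx11) / dx11 * 100.0
    print(f"{config:20s} DX11 {dx11:5.1f}  DX12 {dx12:5.1f}  DX12 gain {gain:+5.1f}%")

If the gain column doesn't grow as you go down the list, the DX12 path isn't delivering the CPU-side efficiency it's supposed to.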
 
Actually it should be possible to capture framerates in DX12, with the upper limit being the refresh rate of the display used.

But it cannot be done with a software solution; you need hardware capture, like the sites that measure the performance of console games use. I don't know if any of them have published the secret sauce of how they do it.

Also, DX12 on TR seems to benefit hardware like i3 or FX CPUs, not top-end configurations with a Skylake i7.
 
You guys can't really expect to call this a DX12 benchmark if you don't vary the CPU used. That's been the resounding call to arms as far as DX12 has gone.

This generation there are no amazing new shaders, only improved efficiency layers. This is why they have back-ported DX12 to more than just Maxwell and Fiji.

If you're not going to check into this efficiency improvement, then you're wasting your time.

I don't call it a benchmark, I call it a game. It is never a waste of time evaluating the performance, experience, and highest playable settings of a game people are playing.
 
Thank you all for the feedback on FPS data and whatnot. The last thing I want to do, however, is turn a video card review or game review into a science experiment. I'm no scientist, and this isn't science class. We don't have to be scientific about this; we need to make sure the focus is always on the gameplay experience.

I am really, really getting the vibe that the industry still isn't ready for non-FPS video card reviews, even though framerate often doesn't tell us the truth; we've seen it.
 
Thank you all for the feedback on FPS data and whatnot. The last thing I want to do, however, is turn a video card review or game review into a science experiment. I'm no scientist, and this isn't science class. We don't have to be scientific about this; we need to make sure the focus is always on the gameplay experience.

I am really, really getting the vibe that the industry still isn't ready for non-FPS video card reviews, even though framerate often doesn't tell us the truth; we've seen it.

That's because one person's maximum playable settings aren't playable for someone else.

Someone can like 40-50 FPS with maximum eye candy; I prefer the 80-120 FPS range, even at the cost of settings.
 