FrgMstr

AMD & NVIDIA GPU VR Performance in Call of Starseed - We are back this week to take another objective look at AMD and NVIDIA GPU performance in one of the top-selling games in the VR-only realm, The Gallery Episode 1: Call of Starseed. This is another GPU-intensive title with the ability to put some GPUs on their heels. How do the new RX 480 and GeForce 1000 series perform?
 
I've been keeping my eye on VR and that was a most useful article. I don't actually have the space for such a VR setup - it will be deskbound-only for me.
 
Played this game for a while at CSVR in Vancouver and it was by far my favourite VR game. Maybe by the time Episode 2 comes out I'll be able to afford a VR setup ;)
 
Considering what it costs to buy into VR for the gear, and having the space to walk around (for those VR systems), it seems like VR will have a high dollar price point for the next year or two (or three), and so it would largely be a luxury toy for gamers who can also afford a better GPU. Why spend all that money and cheap out on the most critical part of the system after the VR gear itself? Having no high-end chips until Vega or later seems like it will really hurt AMD in this market segment.
 
It's cool seeing the 980Ti do so well in these VR tests. I remember hearing that it would do poorly in VR due to something with the Maxwell architecture. I forget exactly what it was, but it had something to do with higher frame latency. Looks like what I heard was wrong.
 
You don't need to drop to half framerate with vsync on; that's purely a double-buffering issue. Triple buffering does not drop to half framerate because the GPU does not have to stall waiting for buffer swaps. NVIDIA Fast Sync is just an API-hidden triple-buffering mechanism as well.
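To make that concrete, here is a toy frame-pacing simulation (the numbers are my own assumptions, not from the article: a 90 Hz refresh and a 13 ms render time that just misses one refresh interval). With double buffering the GPU stalls until the swap, so it only presents on every other vblank; with a third buffer it keeps rendering and presents nearly every vblank.

```python
def new_frames_per_second(buffers, refresh_hz=90, render_s=0.013):
    """Count distinct new frames displayed over one second of vblanks."""
    ready_at = render_s          # when the next rendered frame finishes
    shown = 0
    for k in range(1, refresh_hz + 1):
        vblank = k / refresh_hz  # time of the k-th vblank
        if ready_at <= vblank:
            shown += 1
            if buffers == 2:
                # double buffering: the GPU stalled waiting for this
                # swap, so the next frame starts rendering only now
                ready_at = vblank + render_s
            else:
                # triple buffering: the GPU kept rendering into the
                # spare buffer, so the next frame is already in flight
                ready_at += render_s
    return shown

print(new_frames_per_second(2))  # 45: locked to every other refresh
print(new_frames_per_second(3))  # 76: close to the 77 fps the GPU can do
```

In other words, a GPU that misses 90 fps by a couple of milliseconds delivers ~76 fps with a spare buffer but exactly 45 fps double-buffered, which is the "drop to half" the article describes.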
 
NV's approach is definitely going to win out. They already help co-develop Unreal Engine 4 with Epic Games. PhysX is the physics engine built into UE4. Lots of GameWorks libraries are available that hook into UE4's development path.

Physics Simulation

NVIDIA GameWorks and UE4

Get Ready: Unreal Engine 4 to Incorporate NVIDIA GameWorks VR | NVIDIA Blog

NV is aggressive on software optimizations, pushing its DLLs to developers, and getting into a base engine that is used for VR like UE4 is a pure win.

AMD hopes its GPUOpen approach will win, but it's not going to happen. Devs have shown no real desire to use open source features on their own. They need enticement, either via software engineers from NV or AMD when games get sponsored, or from marketing sponsorship deals.

It's the same as with GameWorks in the past: it runs better on NV, often a lot better, especially in the first month before AMD is able to release a driver for it. We've seen this battle already, and it's one where NV wins.
 
Considering what it costs to buy into VR for the gear, and having the space to walk around (for those VR systems), it seems like VR will have a high dollar price point for the next year or two (or three), and so it would largely be a luxury toy for gamers who can also afford a better GPU. Why spend all that money and cheap out on the most critical part of the system after the VR gear itself? Having no high-end chips until Vega or later seems like it will really hurt AMD in this market segment.

Dunno, with the current price of VR, would it really hurt? I don't see people running out and buying it hand over fist. It's still a 1%'er market, so I don't think AMD waiting on Vega is a huge loss as of right now. If Vega falls flat on its face next year, then yeah, AMD's definitely going to have issues in this market.


NV's approach is definitely going to win out. They already help co-develop Unreal Engine 4 with Epic Games. PhysX is the physics engine built into UE4. Lots of GameWorks libraries are available that hook into UE4's development path.

Physics Simulation

NVIDIA GameWorks and UE4

Get Ready: Unreal Engine 4 to Incorporate NVIDIA GameWorks VR | NVIDIA Blog

NV is aggressive on software optimizations, pushing its DLLs to developers, and getting into a base engine that is used for VR like UE4 is a pure win.

AMD hopes its GPUOpen approach will win, but it's not going to happen. Devs have shown no real desire to use open source features on their own. They need enticement, either via software engineers from NV or AMD when games get sponsored, or from marketing sponsorship deals.

It's the same as with GameWorks in the past: it runs better on NV, often a lot better, especially in the first month before AMD is able to release a driver for it. We've seen this battle already, and it's one where NV wins.


Throw enough money at the problem and of course you're going to win. But of the people I have talked to who actually do game development, most of them hate having to implement any of the GameWorks shit into their games, but refusing free money isn't an option.
 
One would hope DX11, as used in Starseed, will be used minimally going forward considering all of the advantages DX12 has to offer.
 
You don't need to drop to half framerate with vsync on; that's purely a double-buffering issue. Triple buffering does not drop to half framerate because the GPU does not have to stall waiting for buffer swaps. NVIDIA Fast Sync is just an API-hidden triple-buffering mechanism as well.
VR headsets are not presented as display devices.
The 3D options in the driver config do not apply.
 
It's interesting to see AMD competing where you would expect them in this title, Raw Data was a bit of a let-down. As much as having to re-do tons of tests must suck, we consumers are quite happy to receive frequent driver updates, if they actually fix things: it sure beats the AMD of old.
 
I wonder if this means AMD will cry wolf over every game that uses Nvidia SMP tech for VR, sigh (it will be interesting to see if they have any complaints with Obduction and its SMP, which I bet will be put down to Gameworks more generally, as that is a good narrative for AMD) - I appreciate this is limited to Pascal due to the changes in the Polymorph Engine.
With the latest low-level functions being brought to PC to align with consoles, I do find it tiresome that AMD whines a lot about Gameworks, when GPUOpen is also designed to bring in specific extensions/libraries and optimisations aligned to GCN hardware that will never work for anything else.
My post is not about whether one is open source or not, nor the rights/wrongs of both companies' use of dedicated extensions/libraries for their hardware, but about one corporation crying wolf much more than the other, and this is more relevant now as AMD is pushing for more alignment between functions used on the consoles and those on PCs requiring specific extensions/libraries.
A quick example that GPUOpen lists is Barycentrics 12.

Read this page and the next on Beyond3D about that, and also SM 6.0. The interesting posts are those from Sebbi (who develops on consoles and possibly some aspects on PC) and Andrew Lauritzen, who is a knowledgeable GPU/API engineer at Intel: Direct3D feature levels discussion
Basically, all of these to date benefit only AMD.
Especially made clear in the posts: Direct3D feature levels discussion
Direct3D feature levels discussion
It is a combination of GCN hardware-focused optimisation and the complexity of implementing some other aspects across diverse platforms, so if it is alignment between consoles and PC, developers are most likely to see the best low-level porting again for those functions on AMD hardware.

Now this is not a criticism of AMD, just a reality they should take advantage of IMO, and one reason I am getting tired of AMD's repeated whining about Gameworks, which is not just PhysX or HairWorks but also optimised libraries, just like what AMD is doing.
Cheers
 
So AMD's reply is via a tweet, LOL. At least Raja is responding, and I think HardOCP is right on the money for telling it how it is on the experience. Good or bad, just the facts. It is great that AMD did have a new driver correcting VR performance; that tells me AMD is actively working on VR at least. VR is still at such an early stage for consumers that it is hard to predict how this will pan out over time with the new APIs, developer support and uptake, and then actual real demand for VR games that makes it all worthwhile for big developers to pour money and resources into.

For VR, I want to walk around and not teleport around, is my first thought as well. Room-size VR sounds great, but one will be confined to a fixed room size, cable management, etc. I am not sure how long the wow effect will last with that. Many will have an issue with the cost, but now the room size or area set aside for this makes it even more remote. One solution is the Virtuix Omni, which looks promising to me, except they are not shipping yet. The price may be cheaper than a dedicated room, and one is no longer limited to a fixed-size room either.

Virtuix Omni first of its kind active virtual reality motion platform

For the sit-down experience, which actually is ideal for some game types (car racing/driving, flying, space games, etc.), Dirt Rally is probably a good one to look into. Rave reviews, but it looks to be more for the Rift, though I read it works with the Vive (not sure). Not everyone has the space, nor will everyone consider such a viable option. Will a sit-down VR system be worth it for games that naturally have a sit-down experience? I do not see VR surviving if costs do not come down, without high quality experiences that last beyond the wow factor, and given the chunk of real estate required.
 
VR headsets are not presented as display devices.
The 3D options in the driver config do not apply.

Thank you, I was unaware of that. But does that still rule out having triple buffering?
 
Once this technology matures it will be a game changer, I may convert from being a mountain man to a great indoorsman and never leave home again.
 
Is Carmack saying in other words FreeSync and GSync? Makes sense to me but needs to be proven on real people.

I didn't realize how old that tweet was when I posted it, so he was probably not talking about VR. Variable refresh rate is good for desktop gaming but would not work with the low persistence screens used for VR. Without a consistent framerate the perceived brightness of the image would be continually changing.
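A back-of-envelope version of the brightness point (the 2 ms persistence figure is my assumption, just for scale): on a low-persistence panel the pixels are lit only briefly per frame, so perceived brightness tracks the duty cycle, and a frame held for two refresh intervals looks half as bright.

```python
# Low-persistence panels light each frame only briefly; perceived
# brightness is roughly proportional to lit time / frame interval.
PERSISTENCE_MS = 2.0  # assumed lit duration per displayed frame

def relative_brightness(frame_time_ms):
    """Duty cycle of the panel for a given frame interval."""
    return PERSISTENCE_MS / frame_time_ms

steady = relative_brightness(11.1)  # frame delivered every 90 Hz refresh
late = relative_brightness(22.2)    # slow frame held across two refreshes
# late is half of steady: with a variable refresh rate, every slow
# frame would register as a visible dip in brightness, i.e. flicker
```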
 
I didn't realize how old that tweet was when I posted it, so he was probably not talking about VR. Variable refresh rate is good for desktop gaming but would not work with the low persistence screens used for VR. Without a consistent framerate the perceived brightness of the image would be continually changing.
Thanks, that is a good explanation. I wonder if the screen type can be changed, or is that not possible due to the high PPI of the VR screens?
 
Hmm, what will happen with the leaderboard when you add new GPUs?

Unless you retest everything, either the data would be biased due to different sample sizes, or if you test a new GPU in old games it won't be a fully fair comparison due to patches and driver optimisations?

Also, it would be nice to have a table with such a performance summary for each game separately, in addition to the average among several games.
 
It's cool seeing the 980Ti do so well in these VR tests. I remember hearing that it would do poorly in VR due to something with the Maxwell architecture. I forget exactly what it was, but it had something to do with higher frame latency. Looks like what I heard was wrong.

The FUD and PR BS was wrong again. Who would have thought.
 
Hmm, what will happen with the leaderboard when you add new GPUs?
I will analyze its performance and experience and place it where I think it should go.

Unless you retest everything, either the data would be biased due to different sample sizes, or if you test a new GPU in old games it won't be a fully fair comparison due to patches and driver optimisations?
That will likely not happen, so in that vein it will be a lot like all the charts I see posted in the GPU forum all the time that people seem to very much like. That said, I have been up front and very clear that the Leaderboard is based on objective and subjective observations.

Also, it would be nice to have a table with such a performance summary for each game separately, in addition to the average among several games.
Hmmm....yeah, I guess that would actually keep people from having to read the review. I have been trying to keep The Bottom Line very clear on these VR reviews and will continue to do so for folks who do not want to dig into the analysis.
 
Any chance you could add the GTX 970 to the comparisons? Like a lot of people, I think, I have that card, the min spec for the HTC Vive, and am wondering if the upgrade to a newer card is worth it.
 
Thanks, that is a good explanation. I wonder if the screen type can be changed, or is that not possible due to the high PPI of the VR screens?

The low persistence is desirable because it basically eliminates motion blur. The "sample and hold" of normal screens is bad for VR since your head can move quite a bit in the ~11ms that an image is displayed.

Pixel density is not an issue because there are already higher PPI LCDs than the OLEDs used for VR. OLED is probably used because its lack of backlight allows for better black levels.
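Rough arithmetic behind the sample-and-hold point (head speed and panel density here are my own assumed figures, only for a sense of scale): the smear across the panel is head speed times hold time, converted to pixels.

```python
# How far the image smears across the panel while one frame is held,
# for a moderate head turn.
HEAD_SPEED_DEG_PER_S = 100   # assumed casual head rotation speed
PIXELS_PER_DEGREE = 11       # assumed Vive-class angular pixel density

def smear_pixels(hold_time_ms):
    """Pixels of blur accumulated while a single frame stays lit."""
    degrees_moved = HEAD_SPEED_DEG_PER_S * hold_time_ms / 1000.0
    return degrees_moved * PIXELS_PER_DEGREE

full_hold = smear_pixels(11.1)  # sample-and-hold: ~12 px of blur
pulsed = smear_pixels(2.0)      # low persistence: ~2 px
```

With those assumptions, a full-refresh hold smears the image across roughly a dozen pixels, which is why sample-and-hold reads as obvious motion blur in a headset.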
 
The FUD and PR BS was wrong again. Who would have thought.

I don't remember why people said this, but I'm going to go out on a limb and say it's because "async time warp" involves the term "async".

Why is the 980Ti dropping 1% of frames? Any ideas?
 
Any chance you could add the GTX 970 to the comparisons? Like a lot of people, I think, I have that card, the min spec for the HTC Vive, and am wondering if the upgrade to a newer card is worth it.

Probably not going to happen.

Why is the 980Ti dropping 1% of frames? Any ideas?
Did you read the article? Most of those missed frames happen during Reprojection during Blink transitions.
 
Thank you, I was unaware of that. But does that still rule out having triple buffering?
Not in all games.
In some you can edit the game's config files and enable it if you know the command line.
I no longer use a VR headset so haven't kept up with it.
 
Thank you for the feedback, but I wasn't really thinking from a user perspective; I was thinking from a technical perspective.

To sum it up:

* The article mentions that you drop to half fps if you can't maintain 90fps due to Vsync.
* I mention that this is only the case if it's double buffered. Triple buffering does not drop to half FPS because, with the added buffer, the GPU does not have to sit and wait for a monitor refresh to finish before it has a buffer to render to. I was questioning why they went ahead and made a new technology for a problem we already have a solution for.
* Latency was mentioned as a technical problem.

And there is always the technical reason of increased graphics memory usage (occupation).
 
I saw your brief exchange on reddit, abrasive huh - but he does have a point. I'm all for being blunt, and it is incredibly irritating to see how AMD responds, but if you had just said it nicely, amd diehards would have one less thing to latch onto as 'evidence' of your bias. To be fair, they also gladly lap up anything positive you say about an amd product ;)
Then we would have no weeping fanboys to laugh at....
 
The increased memory usage won't be large compared to the rest of the data in memory: a 4K bitmap with 4B per pixel is only 33 MB.
I agree (depending on AA level and method), that's why I'm trying to understand why they didn't just go with triple buffering to fix the frame rate drops.
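The 33 MB figure above checks out, assuming a 3840x2160 buffer at 4 bytes per pixel:

```python
# One extra 4K RGBA8 back buffer for triple buffering.
width, height, bytes_per_pixel = 3840, 2160, 4
extra_bytes = width * height * bytes_per_pixel
extra_mb = extra_bytes / 1e6   # about 33.2 MB
print(extra_mb)
```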
 
I agree (depending on AA level and method), that's why I'm trying to understand why they didn't just go with triple buffering to fix the frame rate drops.

For triple buffering to work well, you want to be able to output significantly more frames than the refresh rate requires, and that's pretty hard at 100 Hz per eye.
 
Yep, the extra lag from an extra framebuffer may need to be countered.

ps nice name sig :p
 
Interesting. Didn't AMD promote the RX 480 as a cheap way to get into VR, etc.?

Looks like the 1060 is whoopin its arse.....

I mean, why keep spouting VR and wanting to be about VR, but not even be that great at it? I am interested in VR, but I am waiting for it to become more mainstream. I worry it might not take off, just like 3D and those stupid Nvidia glasses, etc.
 
Interesting. Didn't AMD promote the RX 480 as a cheap way to get into VR, etc.?

Looks like the 1060 is whoopin its arse.....

I mean, why keep spouting VR and wanting to be about VR, but not even be that great at it? I am interested in VR, but I am waiting for it to become more mainstream. I worry it might not take off, just like 3D and those stupid Nvidia glasses, etc.

That's why, if I buy it, it's for what is out at the time.

I am also going to try it in person at the Microsoft Store in San Francisco while I am on business. Tempted to get it for Christmas for the kids. Get them off their asses.
 