fightingfi (2[H]4U, joined Oct 9, 2008, 3,231 messages)
https://videocardz.com/newz/amd-rdna-2-to-support-microsoft-directx-12-ultimate
Awesome, no more of this Nvidia-only crap to hear about.
DirectX Raytracing 1.1
DirectX Raytracing (DXR) brings a new level of graphics realism to video games, previously only achievable in the movie industry. The effects achievable by DXR feel more real, because in a sense they are more real: DXR traces paths of light with true-to-life physics calculations, which is a far more accurate simulation than the heuristics based calculations used previously.
We’ve already seen an unprecedented level of visual quality from titles that use DXR 1.0 since we unveiled it, and built DXR 1.1 in response to developer feedback, giving them even more tools with which to utilize DXR.
DXR 1.1 is an incremental addition over the top of DXR 1.0, adding three major new capabilities:
- GPU Work Creation now allows raytracing. This enables shaders on the GPU to invoke raytracing without an intervening round-trip back to the CPU. This ability is useful for adaptive raytracing scenarios like shader-based culling / sorting / classification / refinement. Basically, scenarios that prepare raytracing work on the GPU and then immediately spawn it.
- Streaming engines can more efficiently load new raytracing shaders as needed when the player moves around the world and new objects become visible.
- Inline raytracing is an alternative form of raytracing that gives developers the option to drive more of the raytracing process, as opposed to handing work scheduling entirely to the system (dynamic-shading). It is available in any shader stage, including compute shaders, pixel shaders etc. Both the dynamic-shading and inline forms of raytracing use the same opaque acceleration structures.
When to use inline raytracing
Inline raytracing can be useful for many reasons:
- Perhaps the developer knows their scenario is simple enough that the overhead of dynamic shader scheduling is not worthwhile. For example, a well constrained way of calculating shadows.
- It could be convenient/efficient to query an acceleration structure from a shader that doesn’t support dynamic-shader-based rays. Like a compute shader or pixel shader.
- It might be helpful to combine dynamic-shader-based raytracing with the inline form. Some raytracing shader stages, like intersection shaders and any hit shaders, don’t even support tracing rays via dynamic-shader-based raytracing. But the inline form is available everywhere.
- Another combination is to switch to the inline form for simple recursive rays. This enables the app to declare there is no recursion for the underlying raytracing pipeline, given inline raytracing is handling recursive rays. The simpler dynamic scheduling burden on the system can yield better efficiency.
Scenarios with many complex shaders will run better with dynamic-shader-based raytracing, as opposed to using massive inline raytracing uber-shaders. Meanwhile, scenarios that have a minimal shading complexity and/or very few shaders will run better with inline raytracing.
If the above all seems quite complicated, well, it is! The high-level takeaway is that both the new inline raytracing and the original dynamic-shader-based raytracing are valuable for different purposes. As of DXR 1.1, developers not only have the choice of either approach, but can even combine them both within a single renderer. Hybrid approaches are aided by the fact that both flavors of DXR raytracing share the same acceleration structure format, and are driven by the same underlying traversal state machine.
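The "well constrained" shadow case is the textbook fit for inline raytracing. A minimal HLSL sketch of a shadow query via DXR 1.1's RayQuery object (Shader Model 6.5); the resource bindings `Scene` and `OutputShadow`, the placeholder ray origin, and the fixed light direction are illustrative assumptions, not from the article:

```hlsl
// Inline raytracing from a compute shader: no raytracing pipeline state,
// no dynamic shader scheduling -- the shader drives traversal itself.
RaytracingAccelerationStructure Scene : register(t0); // assumed TLAS binding
RWTexture2D<float> OutputShadow : register(u0);       // assumed output binding

[numthreads(8, 8, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    RayDesc ray;
    ray.Origin    = float3(id.xy, 0); // placeholder; real code reconstructs a world position
    ray.Direction = normalize(float3(0.3, 1.0, 0.2)); // assumed direction toward the light
    ray.TMin      = 0.001;
    ray.TMax      = 1000.0;

    // Template flags: cull non-opaque geometry (no any-hit shading needed) and
    // stop at the first hit, which is all a shadow ray requires.
    RayQuery<RAY_FLAG_CULL_NON_OPAQUE | RAY_FLAG_ACCEPT_FIRST_HIT_AND_END_SEARCH> q;
    q.TraceRayInline(Scene, RAY_FLAG_NONE, 0xFF, ray);

    // With opaque-only geometry there is nothing to resolve in the loop,
    // so a single Proceed() call completes traversal.
    q.Proceed();

    OutputShadow[id.xy] =
        (q.CommittedStatus() == COMMITTED_TRIANGLE_HIT) ? 0.0 : 1.0; // 0 = shadowed
}
```

Note how no shader tables or hit groups are involved; that is exactly the "overhead of dynamic shader scheduling" the inline form avoids.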
Best of all, gamers with DX12 Ultimate hardware can be assured that no matter what kind of Raytracing solution the developer chooses to use, they will have a great experience.
Raytracing is DXR...
So with the new DX12 Ultimate, will the cards out now (5700, 2070, 2060, 2080s) fully support it, or are we going to have to upgrade to get the shiny DX12 sticker telling us we're now ready to game?
Well, Vulkan has cross-platform ray tracing now too, so ray tracing is not solely DXR.
https://www.khronos.org/news/press/khronos-group-releases-vulkan-ray-tracing
It might be just my observation, but I have the feeling that Nvidia may have indirectly helped AMD achieve their real-time hybrid raytracing implementation through Microsoft: NV and MS collaborated on DXR 1.0, and MS decided to get that framework into their next Xbox console, since MS defined the requirements for AMD to follow in developing the custom APU for the new console.
I do wonder how denoising will be handled by each vendor going forward. Turing can possibly use its tensor cores, while RDNA2 might do it using shaders.
There’s no intellectual capital in DXR. Its concepts have been common knowledge in the graphics industry for decades. The secret sauce is in the hardware implementation and nvidia certainly didn’t share that with AMD.
And of course it’s in everyone’s best interest to define a common api for raytracing (DXR/Vulkan RT). Just like for any other graphics feature that you want developers to use.
No games use tensors for denoising on Turing. RDNA will denoise just fine.
I made no such assertion. I just said the framework for the hardware implementation of DXR, as well as VRS and mesh shading, is very much based on Microsoft and Nvidia's work on them, which naturally builds on the hardware capabilities Nvidia delivered with Turing. I never implied that Nvidia directly shared any proprietary technology with AMD. The DXR 1.0 API spec was developed by MS and NV, after which MS used the resulting requirements in separately developing the custom GPU in the Xbox Series X with AMD.
This isn't correct, not even close. All DX12 cards support DXR; the difference lies in the implementation. Nvidia contributed no more to the API than AMD did.
What you can say, though, is that Nvidia is the only one shipping hardware today with DXR enabled. That's true. But go beyond that and you're getting into territory that's just not accurate.
That's why I've said multiple times that it's really disingenuous to say Nvidia developed ray tracing in games; it's not true at all. Imagination has done more to put ray tracing in games than even Nvidia.
how quickly will RDNA2 arrive in the mid-range and low-end?
i.e. how quickly will we see the 5500 series replaced with something that has HDMI 2.1 and AV1 decode?
Xbox Series X and PS5.
Finally figured out why the demo looked so off and unimpressive, besides the rather cruddy frame rate. While the demo shows different types of reflections (curved surfaces, flat, uneven, rough, etc.), none really looked that great compared to typical raytraced images. The biggest thing I now notice is that there is only one level of reflections: objects seen on a reflective surface show no reflections of their own, so a mirror reflected in a mirror would be blank instead of the two showing each other's reflections. Obviously you would need a virtually infinite number of rays to calculate every reflection of a reflection (typical raytracing handles this to a good level), but at least two levels of reflections would probably have made this demo look better.
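The single-level reflections described above come down to a bounce budget in the reflection shader. A hedged HLSL sketch of how a dynamic-shading closest-hit shader might cap reflection depth via its ray payload (the shader name, `Scene` binding, payload layout, and hardcoded surface normal are illustrative assumptions; with MAX_BOUNCES set to 1 a mirror seen inside another mirror comes back black, as in the demo):

```hlsl
RaytracingAccelerationStructure Scene : register(t0); // assumed TLAS binding

struct Payload
{
    float3 color;
    uint   depth; // how many bounces this path has already taken
};

static const uint MAX_BOUNCES = 2; // 1 reproduces the demo: nested mirrors are blank

[shader("closesthit")]
void MirrorClosestHit(inout Payload p, in BuiltInTriangleIntersectionAttributes attr)
{
    if (p.depth >= MAX_BOUNCES)
    {
        p.color = float3(0, 0, 0); // ray budget spent: this mirror reflects nothing
        return;
    }

    RayDesc ray;
    ray.Origin    = WorldRayOrigin() + RayTCurrent() * WorldRayDirection();
    ray.Direction = reflect(WorldRayDirection(), float3(0, 1, 0)); // assumed surface normal
    ray.TMin      = 0.001;
    ray.TMax      = 1000.0;

    Payload bounce;
    bounce.color = float3(0, 0, 0);
    bounce.depth = p.depth + 1;

    // Recursive TraceRay from a hit shader: the pipeline's declared
    // MaxTraceRecursionDepth must cover MAX_BOUNCES + 1.
    TraceRay(Scene, RAY_FLAG_NONE, 0xFF, 0, 1, 0, ray, bounce);
    p.color = bounce.color;
}
```

Each extra level of mirror-in-mirror costs another full traversal per pixel, which is why demos on early hardware stop at one bounce.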
In PS3/X360 generation we had games with 'next gen' visuals where everything was made out of bump mapping.
I think the next-gen consoles are what's going to help save RTG in AMD. RTX would only stick if developers didn't have an alternative, but since they will, and they will develop for the RDNA tech, whatever raytracing the next-gen games have should run pretty well on AMD.
The only thing is if Nvidia decides to throw some marketing $ at getting games optimized for the tensor cores...
That would be a great waste of marketing dollars since RTX raytracing has nothing to do with tensor cores.
Though not directly related, Tensor Cores are being used for DLSS, which is a key part of making RT viable at this date.
https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-2-0-a-big-leap-in-ai-rendering/
What do you mean "develop for RDNA tech"?
What are the killer performance-improving features of RDNA2 which Turing does not already have? I highly doubt DX12 "Ultimate" would be released just before cards launched with even more features that needed extensions...
Besides, PS4 and XO had GCN GPUs with such a killer feature: very efficient asynchronous compute. Nvidia cards did not properly support it for a long time, and it still made no real difference, so...
I really hope the ray tracing performance of RDNA2 does not suck and is actually much better than Turing's. If not, it will just hinder ray tracing implementation in games, and it will be used to a lesser degree. Maybe more extremely optimized, but as we already saw with the "dynamically loaded textures" bullshit from the PS3/X360 era, these extreme optimization techniques do not always translate well into the PC world and definitely do not contribute to the overall quality of games.
Yes, performance much better than Turing, since Turing has pretty much sucked at RT gaming from a performance perspective, with a lack of really innovative games, or games where one would say "I want that!".
The minimum for RT should be something like 1440p 100fps+ with max or near-max settings and RT on, and 4K 60fps+. Preferably 100fps+ at 4K, but even normal games can't achieve that for the most part with even a 2080 Ti, except maybe Doom Eternal and older titles.