The Streamline is a Lie

Ghostwire: Tokyo uses PhysX, and that just came out.
Not exactly disproving me if that's the only game using PhysX this year. Nearly every game sold for the past decade has had physics in it. PhysX is being phased out. Pretty sure at this point Nvidia isn't helping developers by having it around.
We can thank Mantle for accelerating Microsoft's timeline to release DX12, but that doesn't count as Graphics innovation.

Try again? Show me one innovation AMD has created since 2007 (15 years) that wasn't a copy of something Nvidia did first.
Mantle is the basis of Vulkan and that's a pretty big innovation. It may not be a big deal for you, since as far as I can remember DX12 and Vulkan were not performance improvements for Nvidia hardware until RTX was released. Fact is that AMD has had working asynchronous compute since the creation of GCN, while it took Nvidia a while to not lose performance when using it. AMD was also involved with HBM's creation, along with Samsung and SK Hynix. The only innovation that Nvidia has done is Ray-Tracing, which is arguably garbage unless you play games like Quake 2 or Minecraft. DLSS was created specifically to allow games that could not normally do Ray-Tracing to run it, by literally rendering your game at a lower resolution and then using machine-learning upscaling to mimic a higher resolution. In the process, the RTX 2000 series of cards was largely ignored by Nvidia's customers, because the Ray-Tracing hardware took up so much silicon that games didn't see much of any improvement compared to the GTX 1080s and 1070s. The fact that Ray-Tracing is such a failure and requires DLSS to even have a chance at working is why Nvidia created Streamline in the first place: to encourage developers to work on their technology so their investments in Ray-Tracing weren't a waste.

I'm not for team Red or Green, and I'll gladly go team Blue if Intel makes affordable and capable hardware. Considering what's happened over the past 2 years, both AMD and Nvidia can go to hell for all I care. But to say that AMD has been catching up to Nvidia is just fanatic fanboyism. No, G-Sync doesn't count. An advanced version of Vsync isn't special, and VESA did implement adaptive synchronization, which AMD later rebranded as FreeSync. Nobody gets credit for this technology, which is mostly just meant to drive monitor sales.
 
No, G-Sync doesn't count. An advanced version of Vsync isn't special, and VESA did implement adaptive synchronization, which AMD later rebranded as FreeSync. Nobody gets credit for this technology, which is mostly just meant to drive monitor sales.
Adaptive sync predates VESA by a fair bit; the main function of variable refresh rate was developed at MIT back in the 1960s. CRTs used it to adjust for frequency interference from household AC, and vector displays used it too, since vectors took longer to draw. Arcade games also used it to keep the on-screen experience smooth.

There were a number of really bad, half-assed attempts to bring it to the PC market before Nvidia made it actually work there. So while nobody should get credit for inventing it, at least Nvidia found a way to package and deliver it in a manner that worked.

I like what AMD has been accomplishing lately, not necessarily what they're doing, but they have managed to put the screws to both team blue and team green. I just hope it doesn't go to their heads and they fuck it all up by doing something stupid.
 
Not exactly disproving me if that's the only game using PhysX this year. Nearly every game sold for the past decade has had physics in it. PhysX is being phased out. Pretty sure at this point Nvidia isn't helping developers by having it around.

You're right, I just realized this. It seemed like a lot of recent games had it: Witcher 3, Fallout 4, Batman: Arkham Knight. But those are all from 2015. Recent Assassin's Creed games dropped the special PhysX options. I just looked up Borderlands 3, and that doesn't use it either. I know Metro Exodus had some PhysX-specific features, but that too is 3+ years old now.

I guess Nvidia has mainly been pushing ray tracing and DLSS since 2019 or so. Though in many instances the special PhysX options looked overdone, so it might not be a huge loss. I believe UE4 used PhysX natively (UE5 swapped it for Chaos), so it will probably still see use, but Nvidia-specific graphics options might no longer be implemented in games.
 
Not exactly disproving me if that's the only game using PhysX this year. Nearly every game sold for the past decade has had physics in it. PhysX is being phased out. Pretty sure at this point Nvidia isn't helping developers by having it around.

Mantle is the basis of Vulkan and that's a pretty big innovation. It may not be a big deal for you, since as far as I can remember DX12 and Vulkan were not performance improvements for Nvidia hardware until RTX was released. Fact is that AMD has had working asynchronous compute since the creation of GCN, while it took Nvidia a while to not lose performance when using it. AMD was also involved with HBM's creation, along with Samsung and SK Hynix. The only innovation that Nvidia has done is Ray-Tracing, which is arguably garbage unless you play games like Quake 2 or Minecraft. DLSS was created specifically to allow games that could not normally do Ray-Tracing to run it, by literally rendering your game at a lower resolution and then using machine-learning upscaling to mimic a higher resolution. In the process, the RTX 2000 series of cards was largely ignored by Nvidia's customers, because the Ray-Tracing hardware took up so much silicon that games didn't see much of any improvement compared to the GTX 1080s and 1070s. The fact that Ray-Tracing is such a failure and requires DLSS to even have a chance at working is why Nvidia created Streamline in the first place: to encourage developers to work on their technology so their investments in Ray-Tracing weren't a waste.

I'm not for team Red or Green, and I'll gladly go team Blue if Intel makes affordable and capable hardware. Considering what's happened over the past 2 years, both AMD and Nvidia can go to hell for all I care. But to say that AMD has been catching up to Nvidia is just fanatic fanboyism. No, G-Sync doesn't count. An advanced version of Vsync isn't special, and VESA did implement adaptive synchronization, which AMD later rebranded as FreeSync. Nobody gets credit for this technology, which is mostly just meant to drive monitor sales.

????
 
I asked around among a few friends who still work in game development, and they gave me this ranking:

Quality of effects: Havok > PhysX > Bullet

Ease of use: Bullet > PhysX > Havok

They're hoping that PhysX v5 finally launches and solves their gripes with it. As it currently stands, Havok is just better at rigid-body effects, so it works better in FPS titles, while PhysX does better at ambient effects and other visuals; but those more often than not get turned down or off by most players, so 9 times out of 10 Havok is the better tool for the job.
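
For anyone curious what that ease-of-use ranking means in practice, here's a minimal sketch of Bullet's C++ API: a rigid-body world with one falling sphere. The setup values (gravity, radius, step rate) are illustrative defaults I picked, not anything from their ranking.

```cpp
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Standard Bullet boilerplate: collision config, dispatcher, broadphase, solver, world.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // One dynamic 1 kg sphere, radius 0.5 m, dropped from y = 10 m.
    btSphereShape sphere(0.5f);
    btVector3 inertia(0, 0, 0);
    sphere.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState state(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody body(btRigidBody::btRigidBodyConstructionInfo(1.0f, &state, &sphere, inertia));
    world.addRigidBody(&body);

    // Step the simulation at 60 Hz for one simulated second.
    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f);

    btTransform t;
    body.getMotionState()->getWorldTransform(t);
    std::printf("sphere y after 1s: %.2f\n", t.getOrigin().getY());

    world.removeRigidBody(&body);  // unregister before the body goes out of scope
    return 0;
}
```

That really is the whole thing for a basic rigid-body scene, which is why Bullet tops the ease-of-use list even when Havok wins on effect quality.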
 
I like what AMD has been accomplishing lately, not necessarily what they're doing, but they have managed to put the screws to both team blue and team green. I just hope it doesn't go to their heads and they fuck it all up by doing something stupid.
I think that's been the case for a while now. Look at AMD's 4000- and 5000-series CPUs, which AMD has priced higher than Intel's while performing worse. I haven't seen a good, affordable mid-range GPU from AMD since the RX 480s and 580s. We all knew that once AMD was in Intel's position, they too would just abuse it. It's not like AMD and Nvidia haven't illegally worked with each other to price-fix their products. This is why it's great that Intel is entering the market. It's not great that Intel is supporting Nvidia's Streamline, because that just reinforces Nvidia's position in the market, but it makes sense for Intel to support it since Intel needs the mindshare.
 
Fair deal, still no proof from a 14-year-old story. Pass.
Did the DRAM price-fixing lawsuits ever settle? A duopoly is much more likely to price-fix. Just because they settled doesn't mean nothing happened; it just means you can't say whether it did or didn't.
 
Mantle is the basis of Vulkan and that's a pretty big innovation. It may not be a big deal for you, since as far as I can remember DX12 and Vulkan were not performance improvements for Nvidia hardware until RTX was released.
An API for consoles and weak CPUs isn't Graphics innovation. Prove me wrong.

Fact is that AMD has had working asynchronous compute since the creation of GCN, while it took Nvidia a while to not lose performance when using it.
Not graphics innovation, so not going to bother fact-checking your troll-post.
AMD was also involved with HBM's creation, along with Samsung and SK Hynix.
Not graphics innovation, so not going to bother fact-checking your troll-post.
The only innovation that Nvidia has done is Ray-Tracing, which is arguably garbage unless you play games like Quake 2 or Minecraft. DLSS was created specifically to allow games that could not normally do Ray-Tracing to run it, by literally rendering your game at a lower resolution and then using machine-learning upscaling to mimic a higher resolution. In the process, the RTX 2000 series of cards was largely ignored by Nvidia's customers, because the Ray-Tracing hardware took up so much silicon that games didn't see much of any improvement compared to the GTX 1080s and 1070s. The fact that Ray-Tracing is such a failure and requires DLSS to even have a chance at working is why Nvidia created Streamline in the first place: to encourage developers to work on their technology so their investments in Ray-Tracing weren't a waste.
And you say you aren't a fanboy?? lol that's comical
... But to say that AMD has been catching up to Nvidia is just fanatic fanboyism.
Facts are facts, not fanboyism.
No, G-Sync doesn't count. An advanced version of Vsync isn't special, and VESA did implement adaptive synchronization, which AMD later rebranded as FreeSync.
We have Adaptive Sync in games thanks to Nvidia and [H]'s old article on Frametimes, prior to which everyone just looked at FPS.

VESA didn't do SHIT for adaptive sync in games; their adaptive sync was a power-saving feature for laptops, not even remotely the same tech or purpose.

Got any more smoke to blow up my butthole??
 
I think that's been the case for a while now. Look at AMD's 4000- and 5000-series CPUs, which AMD has priced higher than Intel's while performing worse. I haven't seen a good, affordable mid-range GPU from AMD since the RX 480s and 580s. We all knew that once AMD was in Intel's position, they too would just abuse it. It's not like AMD and Nvidia haven't illegally worked with each other to price-fix their products. This is why it's great that Intel is entering the market. It's not great that Intel is supporting Nvidia's Streamline, because that just reinforces Nvidia's position in the market, but it makes sense for Intel to support it since Intel needs the mindshare.
Priced higher, but on a cheaper platform, so they're in a situation where they can do that because they're still ahead in price/performance. In the vast majority of use cases both platforms are going to be bottlenecked by some other factor, so more often than not the relatively minor performance differences are a wash.

But AMD still has to raise prices a little more to maintain their margins against TSMC's price increases. Their existing 7nm contract was more or less exempt from the previous increase, but their 5nm products will fall under the new price structure.

Intel owns its supply and production chains, so its fabrication costs haven't gone up as much as AMD's and Intel can charge less. Both Intel and TSMC are also raising rates a little to pay for all those new fabs they're building, though Intel, with its far larger customer base, can spread those increases out a lot further.

I'm not looking forward to the price tags on AMD's 5nm product stack. I very much expect them to lose their price advantage there; I'm hoping they can balance it out with a price/performance advantage, but time will sort that question out soon enough.
 
An API for consoles and weak CPUs isn't Graphics innovation. Prove me wrong.
What API is for consoles? Also, the newer APIs do more than just allow faster graphics for weak CPUs.
Not graphics innovation, so not going to bother fact-checking your troll-post.

Not graphics innovation, so not going to bother fact-checking your troll-post.

And you say you aren't a fanboy?? lol that's comical

Facts are facts, not fanboism.
See all this? This is someone who doesn't have anything to add.

Here's Nvidia's Pascal and Maxwell cards not able to do Async Compute.
https://pcper.com/2015/08/dx12-gpu-and-cpu-performance-tested-ashes-of-the-singularity-benchmark/
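
For terminology's sake: at the API level, "async compute" just means feeding the GPU from a dedicated compute queue alongside the normal graphics queue; whether the hardware actually overlaps the two workloads is up to the GPU and driver, which is exactly what the Maxwell/Pascal argument is about. A rough D3D12 sketch (the function and variable names here are mine, for illustration):

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create a compute-only queue next to the usual DIRECT (graphics) queue.
// Work recorded on compute command lists and submitted here *may* run
// concurrently with graphics work -- if the hardware supports it.
HRESULT CreateComputeQueue(ID3D12Device* device, ComPtr<ID3D12CommandQueue>& outQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;          // compute-only queue
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;
    desc.Flags = D3D12_COMMAND_QUEUE_FLAG_NONE;
    return device->CreateCommandQueue(&desc, IID_PPV_ARGS(&outQueue));
}
```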

How's HBM not graphics innovation?

We have Adaptive Sync in games thanks to Nvidia and [H]'s old article on Frametimes, prior to which everyone just looked at FPS.
Last I remember, Nvidia owners were begging to buy FreeSync monitors due to the cost of G-Sync. It took how long before Nvidia supported FreeSync? Nvidia didn't create Adaptive Sync and didn't help push for it. Also, people still test games with FPS in mind.
VESA didn't do SHIT for adaptive sync in games; their adaptive sync was a power-saving feature for laptops, not even remotely the same tech or purpose.
Literally the first sentence from the FreeSync wiki.
https://en.wikipedia.org/wiki/FreeSync

FreeSync is an adaptive synchronization technology for LCD and OLED displays that support a variable refresh rate, aimed at avoiding tearing and reducing stuttering caused by misalignment between the screen's refresh rate and the content's frame rate.
 
What API is for consoles? Also, the newer APIs do more than just allow faster graphics for weak CPUs.

See all this? This is someone who doesn't have anything to add.

Here's Nvidia's Pascal and Maxwell cards not able to do Async Compute.
https://pcper.com/2015/08/dx12-gpu-and-cpu-performance-tested-ashes-of-the-singularity-benchmark/

How's HBM not graphics innovation?


Last I remember, Nvidia owners were begging to buy FreeSync monitors due to the cost of G-Sync. It took how long before Nvidia supported FreeSync? Nvidia didn't create Adaptive Sync and didn't help push for it. Also, people still test games with FPS in mind.

Literally the first sentence from the FreeSync wiki.
https://en.wikipedia.org/wiki/FreeSync

FreeSync is an adaptive synchronization technology for LCD and OLED displays that support a variable refresh rate, aimed at avoiding tearing and reducing stuttering caused by misalignment between the screen's refresh rate and the content's frame rate.
I don't mind most influencers, but it's starting to get ridiculous. They're saying things that are obviously, provably wrong, like "AMD doesn't have patents" or "AMD doesn't make anything". It's a really warped take that's not backed up by anything.

I have both systems and can see value in both, but this crap of literally rewriting history is ridiculous because it creates stupid people.
 
Ashes isn't exactly an unbiased benchmark, but Maxwell and Pascal both did asynchronous compute through Nvidia's HyperQ interface, which launched alongside Kepler. The initial DX12 implementations for it were buggy AF, though, and crippled its performance. HyperQ and DX12 have fundamentally different methods for managing resource barriers: the process that converts a resource (or resources) from one type to another (such as a render target to a texture) and blocks further command execution until the GPU has finished doing any work needed to convert the resources as requested.
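
For anyone who hasn't touched raw D3D12, a transition barrier looks something like this; a sketch using the CD3DX12 helpers from Microsoft's d3dx12.h, with hypothetical names for the command list and resource:

```cpp
#include <d3d12.h>
#include "d3dx12.h"  // open-source helper header from Microsoft's D3D12 samples

// Convert a render target into a texture the pixel shader can sample.
// Commands recorded after the barrier that touch this resource won't run
// until the GPU finishes the conversion -- the stall described above.
void TransitionToShaderResource(ID3D12GraphicsCommandList* cmdList,
                                ID3D12Resource* renderTarget)
{
    const CD3DX12_RESOURCE_BARRIER barrier = CD3DX12_RESOURCE_BARRIER::Transition(
        renderTarget,
        D3D12_RESOURCE_STATE_RENDER_TARGET,
        D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE);
    cmdList->ResourceBarrier(1, &barrier);
}
```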

Nvidia disabled async compute at the driver level on Maxwell consumer cards, and by Pascal they had it on as standard. Attached are the Time Spy results with async on: AMD sees larger gains, but Nvidia was pretty clear at the time (and the link you posted mentions this) that they traded DX12 performance for DX11, since more games were still releasing on DX11 and DX12 wasn't offering much in the way of real-life benefits at that stage.

In regard to FreeSync: VESA and AMD worked for 2 years to develop a standard to counter G-Sync, and for an additional 2 years to get it working reliably in the wild. It's only in the last few years (since about 2020) that I would call the two comparable.

But the core function of FreeSync, the adaptive frame sync, was in there long before; it just wasn't used for games at all. I remember back in the '90s it being advertised on my 386 laptop as a battery-saving feature. And I distinctly remember that the blinking cursor in Lotus Notes broke it and kept it from ever lowering the refresh rate. It would chew through the battery like bubble gum.

Edit: It's worth pointing out that, like Nvidia did with Maxwell, AMD eventually had to disable async compute on all their GCN 1.0 cards at the driver level (Radeon Software 16.4.2 onwards) and only left it on for GCN 1.1 and newer hardware.
 

Attachment: 97FC41B7-BAD7-404B-ADE9-55C4BE7CF724.png (Time Spy results, 75.8 KB)
I don't mind most influencers, but it's starting to get ridiculous. They're saying things that are obviously, provably wrong, like "AMD doesn't have patents" or "AMD doesn't make anything". It's a really warped take that's not backed up by anything.

I have both systems and can see value in both, but this crap of literally rewriting history is ridiculous because it creates stupid people.
At the same time, Duke did say, "The only innovation that Nvidia has done is Ray-Tracing," which is also pretty laughable.
 
At the same time, Duke did say, "The only innovation that Nvidia has done is Ray-Tracing," which is also pretty laughable.
Factually, they didn't do that either. SGI had ray-tracing accelerators long before Nvidia ever did. Nvidia brought it to the consumer space first, that's true. But even that had more to do with SGI than Nvidia, since Nvidia pretty much acquired most of the SGI talent before it shut down.

The fact of the matter is that most of the things people applaud Nvidia for were acquired. DLSS is the only recent thing I can think of that they did entirely on their own.
 
Factually, they didn't do that either. SGI had ray-tracing accelerators long before Nvidia ever did. Nvidia brought it to the consumer space first, that's true. But even that had more to do with SGI than Nvidia, since Nvidia pretty much acquired most of the SGI talent before it shut down.

The fact of the matter is that most of the things people applaud Nvidia for were acquired. DLSS is the only recent thing I can think of that they did entirely on their own.
If we're going to split hairs like that, then AMD hasn't done anything on their own either...
 
What API is for consoles? Also, the newer APIs do more than just allow faster graphics for weak CPUs.
Newer APIs are not Mantle...
See all this? This is someone who doesn't have anything to add.
Just ignoring the troll-rants of a fanboi...
"Some players have had the game keep crashing upon startup, and one of the culprits can be something called Async Compute, or Asynchronous Compute. Here's how to turn the feature off. Essentially Async Compute is a feature that allows GPUs based on AMD's GCN to perform graphics and compute workloads at the same time. In some cases, it can lead to a boost in performance, but in this case, you'll want to toggle it off."

Programmable GPU doing programmable GPU things. Performance hits don't fit the definition of "Graphics Innovation" for me. YMMV.

https://www.gameskinny.com/y2zhw/ha...ally Async Compute is a,want to toggle it off.

How's HBM not graphics innovation?
A memory innovation used by a GPU isn't Graphics Innovation.
Last I remember, Nvidia owners were begging to buy FreeSync monitors due to the cost of G-Sync. It took how long before Nvidia supported FreeSync? Nvidia didn't create Adaptive Sync and didn't help push for it. Also, people still test games with FPS in mind.

Literally the first sentence from the FreeSync wiki.
https://en.wikipedia.org/wiki/FreeSync

FreeSync is an adaptive synchronization technology for LCD and OLED displays that support a variable refresh rate, aimed at avoiding tearing and reducing stuttering caused by misalignment between the screen's refresh rate and the content's frame rate.
Literally came AFTER and IN RESPONSE TO G-Sync.
Ashes isn't exactly an unbiased benchmark, but Maxwell and Pascal both did asynchronous compute through Nvidia's HyperQ interface, which launched alongside Kepler. The initial DX12 implementations for it were buggy AF, though, and crippled its performance. HyperQ and DX12 have fundamentally different methods for managing resource barriers: the process that converts a resource (or resources) from one type to another (such as a render target to a texture) and blocks further command execution until the GPU has finished doing any work needed to convert the resources as requested.

Nvidia disabled async compute at the driver level on Maxwell consumer cards, and by Pascal they had it on as standard. Attached are the Time Spy results with async on: AMD sees larger gains, but Nvidia was pretty clear at the time (and the link you posted mentions this) that they traded DX12 performance for DX11, since more games were still releasing on DX11 and DX12 wasn't offering much in the way of real-life benefits at that stage.

In regard to FreeSync: VESA and AMD worked for 2 years to develop a standard to counter G-Sync, and for an additional 2 years to get it working reliably in the wild. It's only in the last few years (since about 2020) that I would call the two comparable.

But the core function of FreeSync, the adaptive frame sync, was in there long before; it just wasn't used for games at all. I remember back in the '90s it being advertised on my 386 laptop as a battery-saving feature. And I distinctly remember that the blinking cursor in Lotus Notes broke it and kept it from ever lowering the refresh rate. It would chew through the battery like bubble gum.

Edit: It's worth pointing out that, like Nvidia did with Maxwell, AMD eventually had to disable async compute on all their GCN 1.0 cards at the driver level (Radeon Software 16.4.2 onwards) and only left it on for GCN 1.1 and newer hardware.
Posting facts will get you labeled as a fanboi, so watch out for the trolls and haters.
 