I'm not sure why these benefits (reasons) weren't mentioned. It almost seems disingenuous to frame a PC power-efficiency discussion solely around the electricity bill.
Because this wasn't about power efficiency to begin with, but people keep coming up with new BS to counter everything everyone says. Prove one thing wrong and they spout off about something else.
 
Heh, yea I get that.

I was commenting on that particular discussion, and more specifically, on the Youtube video linked on the same topic. I should have mentioned that.

As for the thread: Yea, it's encouraging that the AMD CPUs seem to have held up over time. I'm looking forward to Zen (although my eyes are on Polaris 10 and the GTX 1070 for now).
 
My CPU is fine. I will be getting the new AMD CPU on release. Really, I'm in no big hurry.
 
I think you misunderstood what I said. Take all of the red figures off the first chart. Then compare say the 2600K to the 4770K. There really isn't much difference in the two processors. This is proof that Intel hasn't really increased their processor power over the years. That's also why the AMD CPUs can hang right in there with the best that Intel has.

Alternative explanation: GPU bottleneck.
 
Have an 8370 and play Doom or anything else I wanna get into. No need to read 3 pages.
 
Frankly those small FPS differences don't make or break my experience. I also lock my refresh rate at 60FPS because my monitor isn't too great. To each their own. I wouldn't be worried about 5-10fps...
 
i always snicker when i hear folks complaining about power usage...

i run my quad Opteron 61xx setup (48 K10 cores) on the High Performance plan... i run 8 cores at 3.5 GHz, 8 cores at 3.6 GHz and all the other slow cores at 2.1 GHz. Just idling it sucks around 570 watts. i think it probably costs me about $12-$13 to run it a month. Less during the winter because it heats up my computer room and keeps things cozy.
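For anyone who wants to sanity-check that estimate, the arithmetic is simple. The $/kWh rates below are assumptions for illustration; plug in your own utility's rate:

```python
# Rough monthly cost of a PC pulling a constant draw at the wall.
# The electricity rates used here are assumptions, not from the post.
def monthly_cost(watts, rate_per_kwh, hours=24 * 30):
    kwh = watts / 1000 * hours       # energy used over the month
    return kwh * rate_per_kwh

# 570 W around the clock is ~410 kWh/month. At ~$0.03/kWh that lands
# near the quoted $12-13 figure; at $0.10/kWh it's closer to $41.
print(round(monthly_cost(570, 0.03), 2))
print(round(monthly_cost(570, 0.10), 2))
```

So whether the bill is trivial or noticeable mostly comes down to your local rate and how many hours the box actually runs.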

Anyone know if Doom would take advantage of all the cores on my machine?

I value power-efficiency, but not specifically for the minimal electricity savings.

1. Less heat in the room = improved comfort. Also: using the air conditioner slightly less might factor into that electricity cost I've been ignoring. This of course will depend on where you live... but I HATE having a hot PC room.
2. Less heat (being produced) in the case
3. Less fan noise (potentially) for the same cooling efficacy
4. More environmentally friendly. I try to save power when possible.

I'm not sure why these benefits (reasons) weren't mentioned. It almost seems disingenuous to frame a PC power-efficiency discussion solely around the electricity bill.
 
i always snicker when i hear folks complaining about power usage...

i run my quad Opteron 61xx setup (48 K10 cores) on the High Performance plan... i run 8 cores at 3.5 GHz, 8 cores at 3.6 GHz and all the other slow cores at 2.1 GHz. Just idling it sucks around 570 watts. i think it probably costs me about $12-$13 to run it a month. Less during the winter because it heats up my computer room and keeps things cozy.

Anyone know if Doom would take advantage of all the cores on my machine?
Doom will run into a huge issue if it detects more than 12 threads. If I enable all my cores (14+HT) it runs about 30 FPS on the lowest settings. Turn off HT and disable 2 cores and it runs 60FPS on max settings.
 
Doom will run into a huge issue if it detects more than 12 threads. If I enable all my cores (14+HT) it runs about 30 FPS on the lowest settings. Turn off HT and disable 2 cores and it runs 60FPS on max settings.

Has anyone else seen this? If that's the case it could be an issue under Zen if you keep HT on.
 
Has anyone else seen this? If that's the case it could be an issue under Zen if you keep HT on.

It's an issue already, with DOOM. Guessing their multi-threaded code just wasn't written for that many threads, and it falls back to some crap "big lock" scheme. You can hardly blame a blatant software issue on highly threaded CPUs.
 
It's an issue already, with DOOM. Guessing their multi-threaded code just wasn't written for that many threads, and it falls back to some crap "big lock" scheme. You can hardly blame a blatant software issue on highly threaded CPUs.

Sounds like a bug. Realistically it's likely just a thread pool, and a thread pool isn't tied to any particular thread count.
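For illustration, here's a minimal toy sketch of what a "big lock" bottleneck looks like: every thread funnels through one global lock, so adding threads adds contention rather than throughput. This is an invented example, not Doom's actual code:

```python
import threading

# Toy "big lock" sketch: all workers serialize on one global lock,
# so 16 threads do the work of roughly one. Not id Software's code.
big_lock = threading.Lock()
counter = 0

def worker(iterations):
    global counter
    for _ in range(iterations):
        with big_lock:          # every thread funnels through one lock
            counter += 1

threads = [threading.Thread(target=worker, args=(1000,)) for _ in range(16)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 16000: the result is correct, but execution was serial
```

The result is right either way; the problem is that the lock turns 16 threads of "parallel" work back into a serial queue, which is consistent with more cores making things slower rather than faster.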
 
My 3930K can run Doom like a joke. You do not need a powerhouse CPU for this game lol.....
 
I was surprised when I played it. Forgot to set my OC on my 8350 and wound up playing @4.0GHz. It ran great. Next time I'll go in @4.6GHz and see what, if any, difference it makes.
 
it got cool enough last night that I turned my clocks back up. fx8120@4560 and 280x @ 1220/1600 under Vulkan 1080p ultra gave me 100+FPS avg. there were sections where it even got into the low 130s! lowest I saw was 65fps. man was I shocked by how big of a difference Vulkan makes!! as stated before this is just from watching the in-game overlay. but HOLY SHIT!
 
it got cool enough last night that I turned my clocks back up. fx8120@4560 and 280x @ 1220/1600 under Vulkan 1080p ultra gave me 100+FPS avg. there were sections where it even got into the low 130s! lowest I saw was 65fps. man was I shocked by how big of a difference Vulkan makes!! as stated before this is just from watching the in-game overlay. but HOLY SHIT!

We've been stuck with DirectX/OpenGL for too long, way too long....
 
We've been stuck with DirectX/OpenGL for too long, way too long....

How have we been "stuck" with them? What alternatives were there?

By the way, the classic high-level versions of DirectX and OpenGL aren't going away. Microsoft released DirectX 11.3 alongside DX12, for instance.
 
How have we been "stuck" with them? What alternatives were there?

By the way, the classic high-level versions of DirectX and OpenGL aren't going away. Microsoft released DirectX 11.3 alongside DX12, for instance.
I think what he refers to is how limiting they were compared to DX12/Vulkan. Doom shows that hand over fist, especially with Vulkan. Us guys with FX 8-cores are finally seeing them get used how they were intended. I played the demo with Vulkan, and on my FX8350 @4.0 and my R9-290 @1100/1400 I was running 120FPS with all options turned up except shadows (haven't tried higher, so it may not be a big deal) and all blur, radial and DOF, turned off (never use them). That is damned impressive. The max frametime I saw on my CPU was 10ms, and considering I usually game @4.66GHz, there is nothing negative I can say about DOOM's implementation of Vulkan. Heck, I dare say it has to be the best lead-off representation of a new API ever.
 
He says, with his powerhouse CPU. ;)

Ok, I can see where I sounded hypocritical lol.... what I meant is even my 5-year-old CPU can run the game just fine. I am betting that any modern-day i5 and up can easily handle Doom 2016. That is really what I was trying to say.
 
I was surprised when I played it. Forgot to set my OC on my 8350 and wound up playing @4.0GHz. It ran great. Next time I'll go in @4.6GHz and see what, if any, difference it makes.

I think you are going to see diminishing returns at 4.6. This game really is GPU dependent more than anything it seems.
 
I think what he refers to is how limiting they were compared to DX12/Vulkan.

They aren't limiting, they just serve different goals. There are very good reasons why you would not want low-level access even in a bleeding edge 3D application. You'll still see games ship using DX 11.3 or OGL instead of DX12 or Vulkan.
 
They aren't limiting, they just serve different goals. There are very good reasons why you would not want low-level access even in a bleeding edge 3D application. You'll still see games ship using DX 11.3 or OGL instead of DX12 or Vulkan.

Those games either have no performance issues in areas like FPS, or they are very complex game engines that try to do things that were not possible before under DX/OpenGL and now can.
And yes, when you have 8 cores and the API only lets you use one for sending data to the GPU, that is pretty lacklustre.
Where CPU performance is limited by IPC in those older APIs, we can not have any real improvements over the next decade in how our PCs perform compared to, say, a console (consoles would scale much better on lesser hardware).

The idea that you don't "need" to use a low-level API is also something which does not benefit the gaming market. With more hardware available to run your game, why would there not be an increase in sales?
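The "one core sending data to the GPU" point can be sketched abstractly. The function and variable names below are invented for illustration, not real Vulkan/DX12 calls: each core records its own slice of draw calls in parallel, and only the final submit step is serialized:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch of DX12/Vulkan-style multi-threaded command
# recording. Names here are invented; real APIs use command buffers.
def record_commands(draws):
    # Each worker thread records its own slice of draw calls independently.
    return [f"draw:{d}" for d in draws]

all_draws = list(range(8000))
chunks = [all_draws[i::8] for i in range(8)]   # one slice per "core"

with ThreadPoolExecutor(max_workers=8) as pool:
    command_lists = list(pool.map(record_commands, chunks))

# Only the final submit is serialized, not the per-draw recording --
# that's the scaling win over a single submission thread.
submitted = sum(len(cl) for cl in command_lists)
print(submitted)
```

Under the old single-submission-thread model, all 8000 draws would be prepared on one core while the other seven sat idle.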
 
They aren't limiting, they just serve different goals. There are very good reasons why you would not want low-level access even in a bleeding edge 3D application. You'll still see games ship using DX 11.3 or OGL instead of DX12 or Vulkan.
No, not different goals. It is a progression. DX12 is better in every way than DX11 in today's market. Previous versions of DX made some sense: when they were ushered in, an abstraction layer was necessary because of the vast number of graphics manufacturers. Today we have 3, 2 of which have funds enough to make simple use of DX12/low-level APIs. The 3rd has less funds, but with consoles, where low-level APIs are the word, most of the work is done with/for them.

This fear mongering over DX12 is nothing more than that, fear mongering. The benefits are vast and apply to nearly everyone, more so those on the lower end. Vulkan has 5-year-old cards producing framerates at the level of high-tier current cards, which is above and beyond anything we have seen to date. It has made processors of all ages viable longer, with far less need to upgrade, and allows far more consumers to buy games since their hardware is no longer a huge issue.
 
I think you are going to see diminishing returns at 4.6. This game really is GPU dependent more than anything it seems.
True, hell, I only have a 60Hz monitor, but that max frametime will probably be less than 10ms with 4.6GHz. Then again, I'm already seeing 120fps 99% of the time, so why bother, as you said.
 
This fear mongering over DX12 is nothing more than that, fear mongering.

It's not fear mongering. It is a fact. The two APIs will exist side by side. What is scary about that to you?

The idea that you don't "need" to use a low-level API is also something which does not benefit the gaming market. With more hardware available to run your game, why would there not be an increase in sales?

Don't ask me. Ask whichever developers stick with DX11 or OGL.

You two are very defensive, for some reason.
 
It's not fear mongering. It is a fact. The two APIs will exist side by side. What is scary about that to you?



Don't ask me. Ask whichever developers stick with DX11 or OGL.

You two are very defensive, for some reason.
Not defensive, just love watching others squirm trying to invalidate a superior method because of their fear. My total system has seen great gains going to Win10 and using DX12/Vulkan. I see from your rig you are seeing nothing with either, and that is why you have great disdain for them. DX11 is dated and, like others before it, will fade from existence, granted it will take some time. But since consoles are using both x86 and hardware being used in PCs, DX12 has a very lucrative and promising future. Even the adoption rate of DX12 far surpasses earlier iterations.
 
Not defensive, just love watching others squirm trying to invalidate a superior method because of their fear.

Then you live in an alternate universe, because nothing of the sort is happening.

You're going to be extremely shocked when "DirectX 11.3" is listed as a supported API on new games, I guess.
 
Then you live in an alternate universe, because nothing of the sort is happening.

You're going to be extremely shocked when "DirectX 11.3" is listed as a supported API on new games, I guess.
You're being obtuse just to obfuscate the actual argument. No one said only DX12/Vulkan going forward. We stated the facts: DX12/Vulkan is superior in every aspect, has thus far a greater adoption rate than any previous API, and Doom is a picture-perfect example of this API, albeit in its infancy. Seems the only detractors are owners of hardware that does not thus far exhibit any positive results.
 
We stated the facts: DX12/Vulkan is superior in every aspect

This is an opinion, not a fact. DX12/Vulkan are simply tools that accomplish a specific objective. They are not objectively better or worse as tools than DX11.3 or OGL. They accomplish Goal A through Tradeoff B, just as DX11.3 and OGL accomplish Goal C through Tradeoff D. What you're saying is equivalent to saying programming in Assembly is always preferable to C++.

There are still going to be modern video games shipping using DX11.3 instead of 12. Both versions will exist. There will probably be a DX11.4 at some point, just as there will be a DX12.1 at some point. DX11 is not deprecated by DX12. OGL is not deprecated by Vulkan. High level graphics APIs will continue to exist and be used. They will exist alongside low level APIs, rather than be entirely replaced by them, as you seem to believe.
 
This is an opinion, not a fact. DX12/Vulkan are simply tools that accomplish a specific objective. They are not objectively better or worse as tools than DX11.3 or OGL. They accomplish Goal A through Tradeoff B, just as DX11.3 and OGL accomplish Goal C through Tradeoff D. What you're saying is equivalent to saying programming in Assembly is always preferable to C++.

There are still going to be modern video games shipping using DX11.3 instead of 12. Both versions will exist. There will probably be a DX11.4 at some point, just as there will be a DX12.1 at some point. DX11 is not deprecated by DX12. OGL is not deprecated by Vulkan. High level graphics APIs will continue to exist and be used. They will exist alongside low level APIs, rather than be entirely replaced by them, as you seem to believe.
Ok, try this: how many games in the last 2 years released with both DX9 and DX11? You have some serious ignorance and smoke screening going on. Just look at Doom. The OpenGL path is probably one of the best in recent history, and yet Vulkan is better. Even looking at the basics of DX12, one can easily see how it is superior to any previous version. It takes time to shift the industry, and since development for most games takes 3-5 years, the transition we are seeing is unprecedented, and with it come a few less-polished versions. But without a doubt DX12 will eclipse all other previous versions in short order. Add the fact of consoles and Microsoft's Xbox One and you easily see DX12 dominating quickly.
 
This is an opinion, not a fact. DX12/Vulkan are simply tools that accomplish a specific objective. They are not objectively better or worse as tools than DX11.3 or OGL. They accomplish Goal A through Tradeoff B, just as DX11.3 and OGL accomplish Goal C through Tradeoff D. What you're saying is equivalent to saying programming in Assembly is always preferable to C++.

You are missing the point completely. Mantle, DX12 and Vulkan all allow the graphics card to be fully used, with the CPU there just to send data (all of the cores, all the time). Unlike the older APIs, this is a much-improved situation where there is scaling when using multiple (beyond 4-5) cores, without suffering from negative scaling.
They allow developers to access all of the hardware in the way they need to, which allows bug fixing by the developer without drivers getting in the way.

You might as well argue that Visual Basic is good enough; that is an opinion..... not fact.
 
You are missing the point completely

I'm not missing anything. I'm not arguing that low-level graphics APIs don't offer better performance. I am telling you that low-level APIs do not replace the existing high-level APIs. They exist side by side. Microsoft stated this directly when they announced DX12.

Ok, try this: how many games in the last 2 years released with both DX9 and DX11? You have some serious ignorance and smoke screening going on.

DirectX 11 replaced DirectX 9. DX12 does not replace DX11. Microsoft shipped DX11.3 in Windows 10 in conjunction with DX12, and stated that they intend to continue developing both.

The problem with your analogy is that you assume DX12 replaces DX11. It does not. It exists side by side with DX11. Some games will use 12, some will use 11.3.
 
I'm not missing anything. I'm not arguing that low-level graphics APIs don't offer better performance. I am telling you that low-level APIs do not replace the existing high-level APIs. They exist side by side. Microsoft stated this directly when they announced DX12.



DirectX 11 replaced DirectX 9. DX12 does not replace DX11. Microsoft shipped DX11.3 in Windows 10 in conjunction with DX12, and stated that they intend to continue developing both.

The problem with your analogy is that you assume DX12 replaces DX11. It does not. It exists side by side with DX11. Some games will use 12, some will use 11.3.
Lol. You are too funny. Moving the goal posts constantly. At some point tonight I am gonna illustrate and educate you on the topic at hand. No time now.
 
I'm not missing anything. I'm not arguing that low-level graphics APIs don't offer better performance. I am telling you that low-level APIs do not replace the existing high-level APIs. They exist side by side. Microsoft stated this directly when they announced DX12.



DirectX 11 replaced DirectX 9. DX12 does not replace DX11. Microsoft shipped DX11.3 in Windows 10 in conjunction with DX12, and stated that they intend to continue developing both.

The problem with your analogy is that you assume DX12 replaces DX11. It does not. It exists side by side with DX11. Some games will use 12, some will use 11.3.

But DX10, is a derivative of DX9 which was a derivative of DX8. So if you subtract DX11.3 from Windows 10, you get Windows -1.3. So by definition DX 12 is derivative of DX11.3.

Which if used with an appropriate video card constantly outputs 42. Most Vulcans know this.
 
But DX10, is a derivative of DX9 which was a derivative of DX8. So if you subtract DX11.3 from Windows 10, you get Windows -1.3. So by definition DX 12 is derivative of DX11.3.

Which if used with an appropriate video card constantly outputs 42. Most Vulcans know this.


 