Wolverine2349
Weaksauce
- Joined
- May 30, 2016
- Messages
- 74
I notice that Command & Conquer: Generals Zero Hour on my Core i7-12700K, with E-cores and Hyper-Threading disabled and all 8 P-cores at 5GHz, always pegs one core at 100% while all the other cores sit unused.
Now I know old games would never use more than one core, but it seems odd that it would need 100% of that core. I remember even back in the old days, around 2002 to early 2005 before multi-core CPUs, it was said you had a CPU bottleneck if the CPU was at almost 100% usage.
So one CPU core today is like one physical CPU back then, except a single core of even a ten-year-old Intel Core CPU, or even a Core 2 Duo, is so much more powerful at the same clock speed than a Pentium 4 or Athlon XP. Let alone Alder Lake Performance cores, which are by far the fastest cores on the market clock for clock.
So why would such an old game need to peg 100% of one super-fast core that is probably at least 20-30 times faster at the same clock speed than the Pentium 4 Northwoods or Athlon XPs back in the day?
Or is it just how the modern Windows scheduler handles very old games, throwing 100% of one core at them regardless?
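For what it's worth, the usual explanation isn't the scheduler; it's how a game's main loop waits between frames. Many games of that era ran an uncapped loop, or capped the frame rate by busy-waiting (spinning) until the next frame was due, which keeps the core 100% busy even when there is almost no real work to do. A faster core just spins more. Here is a minimal sketch of the two loop styles; this is a hypothetical illustration, not Generals' actual code:

```python
import time

def run_frames(limiter, n_frames=50, target_fps=1000):
    """Simulate a trivial game loop with the given frame-limiter style.

    Returns (spin_count, elapsed_seconds). spin_count counts useless
    spin iterations; it stays zero only if the loop yields the CPU.
    """
    frame_time = 1.0 / target_fps
    spins = 0
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        # ... game logic and rendering would run here ...
        if limiter == "busy-wait":
            # Early-2000s style: spin until the next frame is due.
            # The OS sees the core as 100% busy the whole time.
            while time.perf_counter() - frame_start < frame_time:
                spins += 1
        else:
            # Cooperative style: sleep, handing the core back to the
            # scheduler, so reported usage drops toward actual work done.
            remaining = frame_time - (time.perf_counter() - frame_start)
            if remaining > 0:
                time.sleep(remaining)
    return spins, time.perf_counter() - start
```

Both versions hit the same frame rate, but the busy-wait one racks up spin iterations (and shows as 100% core usage in Task Manager) while the sleeping one shows near-idle.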
I know DOSBox is inefficient and burns ridiculous CPU cycles emulating very old games. But this is a native 32-bit Windows game from 2003, which 64-bit Windows 10 on any x86 CPU runs natively through WOW64; it is not using DOSBox at all.
The game runs great with all settings maxed out, but it's interesting nonetheless.
Just curious why it would peg 100% of such a powerful core when playing, when I remember it only using around 70-80% of a 3GHz P4 Northwood back in the early 2000s. Does a much more powerful video card, a 3090 that sits at maybe 20% usage, versus an old Radeon 9800 Pro or GeForce 6800 GT from 2005, make a big difference in CPU usage?
Mods if this should be in a different forum, please feel free to move it to where it should be.