Worst CPUs of all time?

486 chips were cheaper and more plentiful, and thanks to multipliers, all it took in most cases was a BIOS update and you could be running faster, versus a whole new platform that only gave you an incremental boost. Sure it was 'for the future', but no one really invested in it back then except people with money--I'm specifically talking about those P5 60/66 versions.
The Pentium was much faster when it launched, though, and gave those who got one a huge performance increase over any 486 available at that time. Your tests show it nicely. The 486 DX4 variant, which came a year later, is still usually slower. Faster 486s like the 5x86 came even later.

BTW, Intel could have kept the 486 platform going as its state-of-the-art platform for much longer, but it would have been a bad decision for both us and them, because:
  • It would not really have made anything 486-related any cheaper - it was the new platform that made the 486 more affordable than it otherwise would have been
  • Intel was more worried about the RISC threat - the Pentium had to be released to fight off RISC in other markets and Intel could not afford to beat around the bush. 586 performance, with its platform improvements (64-bit 66 MHz bus), was actually adequate to compete with the first generation of RISC (rough bus numbers in the sketch below)
Of course, a fast 486 like the AMD 5x86 was as fast as the first Pentium, but even its name doesn't suggest it really being from the 486 era, now does it?
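To put that '64-bit 66 MHz bus' platform improvement in rough numbers, here is a back-of-the-envelope sketch of my own (theoretical peak transfer rates only, ignoring wait states and burst protocol details):

```c
#include <stdio.h>

int main(void)
{
    /* Theoretical peak FSB bandwidth = bus width in bytes x bus clock. */
    double mhz_486 = 33.0, bytes_486 = 4.0;  /* 32-bit bus at 33 MHz */
    double mhz_p5  = 66.0, bytes_p5  = 8.0;  /* 64-bit bus at 66 MHz */

    printf("486 FSB peak:     %6.1f MB/s\n", mhz_486 * bytes_486); /* ~132 MB/s */
    printf("Pentium FSB peak: %6.1f MB/s\n", mhz_p5 * bytes_p5);   /* ~528 MB/s */
    return 0;
}
```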

Yes, the socket changes are definitely funny when the pins are essentially the same, or it's just many more VCC and ground pins.
The best example of Intel BS is people putting Coffee Lake into Skylake Z170 motherboards... for DDR3. It can be done and works pretty well. I regret not going this route myself, as that would have made it a hell of a fun platform to own and use.

LGA1156 was imho better than LGA1155, which felt like a downgrade that artificially kept non-K CPUs from overclocking.
Later LGA11xx sockets and even LGA1200 made even less sense, and I didn't see anything changing other than the amount of money in my bank account, because of the need to keep buying more and more expensive motherboards.

Of course a 486 DX4-200 wouldn't have been 2x as fast as a 486 DX2-100. Even a 486 DX2-66 was only about as fast as a 486 DX-50. I tested these personally when we built our 486 and also got a prebuilt that was a 486 DX-50 with 12 EISA slots in a nice heavy case, but it was more expensive than our build so we returned it.

I never got a chance to mess with a 486 beyond a DX2-66 and DX-50, but I remember the AMD 486 DX4-80 was pretty much on par with the P5-60 for all practical 'real world' purposes. Remember that the software back then was still mainly 16-bit, so the Pentium was a bit of a 'meh' with its ability to do 32-bit stuff. I stand by my claim that a 486 DX4-200 would have been 2x faster than the Pentium 60 in 'real world' usage of the software at that time. In fact, someone else actually lived it and did a whole set of benchmarks that reflect what I remember:
https://dependency-injection.com/diamond-stealth-64-dram/?comment-21481
https://dependency-injection.com/the-perfect-pentium/
There is nothing from '93 which is comparable to the Pentium 60.

A 50 MHz bus worked fine when you made sure the cards you were using were also okay with it. The idea of creating additional buses required a whole rework, so backward compatibility became a problem. And back then stuff was really expensive, so keeping your existing investment was important. Today, everything is just throwaway and that's stupid imo. Even with systems an order of magnitude faster than that previous era, regular everyday usage isn't an order of magnitude faster, so there's been a lot of performance lost to scaling as well as bloat.
Something had to adopt and popularize the fancy PCI bus, though. Breaking with workarounds like VLB made the whole transition much smoother.

Yep, today's processors have changed the entire motherboard architecture even though 100 MHz is still used as a reference clock--I get that (wish cpuworld would note that better). Still, it would have been neat to see where the old 486 architecture would have topped out, even with the limitations. Because while all the improvements sound impressive on paper, add in the extra complexity of modern software and their impact is seriously lessened.
Things imho wouldn't be a whole lot different. Sooner or later we would have gotten a new 64-bit bus like the first Pentium had, then DDR/QDR, and then a break away from the FSB completely.
Which itself makes the introduction of the 586 bus as early as it happened less necessary, but in the end it wasn't a big deal to anyone. You could still get faster 486 CPUs for some time, including from Intel.

The PPro was far ahead of its time--most software was still 16-bit when it was released. I was there--Win95 and 98 were out, but those were only pieces of 32-bit, much like NT. The software was far behind being 32-bit because most of it still had to work in 16-bit environments as well.

For the record, Doom was NEVER 32-bit. The ID game engine used some 16-bit and 32-bit stuff, but if it was fully 32-bit you could never launch it from 8-bit DOS.
Doom was 32-bit because DOS/4GW was used exactly for the purpose of putting a 386+ CPU into 32-bit protected mode with flat 32-bit memory and 32-bit pointers - it is like running a separate 32-bit OS on top of 16-bit DOS. Even if, for memory-saving purposes, a lot of 16-bit and even 8-bit variables were used, the pointers were still 32-bit.
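To make the 'flat 32-bit memory and 32-bit pointers' point concrete, here is a minimal sketch of my own (not code from Doom) showing the kind of thing a DOS/4GW-style program can do that a plain real-mode program cannot; the compiler/extender choice (e.g. Open Watcom targeting DOS/4GW) is an assumption for illustration, and the C itself is generic:

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned long size = 1024UL * 1024UL;  /* 1 MB: far beyond the 64 KB limit of a 16-bit segment */
    unsigned char *buf = malloc(size);
    unsigned long i;

    if (buf == NULL)
        return 1;

    for (i = 0; i < size; i++)             /* linear indexing with one flat pointer, no segment juggling */
        buf[i] = (unsigned char)i;

    printf("touched %lu bytes through one flat 32-bit pointer\n", size);
    free(buf);
    return 0;
}
```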

That said, Doom has to deal with DOS/BIOS interrupts and switch between 32-bit protected mode and 16-bit real mode for I/O, and because of that it isn't really fully 32-bit.

It might even be that this exact characteristic - the stress it puts on the CPU's ability to switch quickly between real and protected mode - is what limits the performance of the Pentium Pro in Win9x, which itself still relied on DOS/BIOS interrupts and I/O. I am not really sure if it is this, something else, or this plus something else. I do know, however, that we are talking relative performance: the Pentium Pro showed huge performance improvements in pure 32-bit OSes and didn't show as much improvement in the typical home OS of the time, Windows 9x. Hence, the moment someone mentions the Pentium Pro, we also have to talk about 32-bit and Windows 9x not being fully 32-bit.

The Pentium Pro's microcode was optimized for 32-bit code vs. 8- and 16-bit like previous processors were. The Pentium II was simply a 'fix' for the mistake of putting the cache in the CPU package and also corrected the microcode. A Pentium II had to have higher clock speeds to match a Pentium Pro in a fully 32-bit environment. It did, however, do better at 8- and 16-bit than the PPro due to the revised microcode. Again, I remember this because I lived it. What shocked me was that the stupid 'cartridge' system scaled up all the way to 1 GHz before it went back to sockets again.
A mistake cost-wise; otherwise, the 1MB of full-speed L2 cache in the package made the Pentium Pro really stand up to the RISC crowd.
The microcode itself could be updated at runtime on the Pentium Pro by a BIOS update and/or the OS. I'm not sure if Intel ever bothered using this feature, or whether it was even possible to fix the issues they patched with the Pentium II revision of the cores.

The Pentium Pro was not so much slow in 16-bit as it just didn't show the expected performance improvement there. In either case it wasn't a deal breaker. The price of the platform and it being superseded by the Slot 1 Pentium II was.

Alpha was one of the fastest NT platforms at its release--it was faster by far than anything else that could run NT in a desktop format. But this was a niche. The other RISC platforms didn't run Windows.
Not that it really mattered for home users anyway :)

Oh so you agree with a point, should I be impressed or surprised...or even care? lol
Definitely you should not care :)

I stopped building anything after our Cyrix P166+ based system, as it was enough. I do have an HP or two with P3 processors in them as well as an IBM or two with a P3, several P4s, and all my Pentium Pros, as well as our original 486 DX-33 and 486 DX2-66. I even still have the first PC that introduced me to the platform--the IBM PS/2 Model 30-286. We modded that quite heavily, adding SCSI, Ethernet, and a 486SLC upgrade before it stopped being used and started sitting. I forgot I have a 486SX too--that was the original Celeron imo--a chip with something disabled (the FPU). It started a trend that exists to this very day. All of these later systems besides the 486 DX2-66 and P166+ came from business use, as hotel property-management systems started upgrade cycles which led to a lot of working hardware ending up in utility chases until I rescued it a decade later and put it back to work.
I myself lived off scraps found on the street until the Pentium D, which was my first system bought new; I got it new because I wanted dual-core badly and no one had a used one to sell at that time 🙃

Anyway, the price drops that newly obsolete platforms experienced after the introduction of newer platforms meant that people like me, without much money or proper ways to get money, could get older but still workable hardware. 386 systems became much cheaper thanks to the 486, and the 486 became much cheaper thanks to the Pentium, which itself dropped in price significantly after everyone was amazed by the Pentium II. Therefore I do not see the introduction of the Pentium as a bad thing. If you look at computer prices in, e.g., 1995, then getting a Pentium 66 made absolutely no sense vs. going 486 with high multipliers - true. That doesn't mean the Pentium, even in its lowest form, didn't have its place... Quake and many other 3D games ran much better on it than on any 486, for example.

ps.
8-bit DOS
DOS was always 16-bit.
The size of the memory segments, and thus of the most commonly used pointers, was 16-bit.
Of course, since the original PC and XT used the 8088 with 8-bit data paths, programmers were incentivized to do as many 8-bit optimizations as possible, but that by itself doesn't really make DOS 8-bit.
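For reference, a quick sketch (mine, just for illustration) of why real-mode DOS pointers count as 16-bit: a far pointer is a 16-bit segment plus a 16-bit offset, and the physical address is simply segment*16 + offset, which is why the address space tops out around 1 MB. Plain C doing the arithmetic, runnable anywhere:

```c
#include <stdio.h>
#include <stdint.h>

/* Real-mode address calculation: 16-bit segment, 16-bit offset, 20-bit physical address. */
static uint32_t linear_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* B800:0000 is the classic colour text-mode video memory location. */
    printf("B800:0000 -> 0x%05lX\n", (unsigned long)linear_address(0xB800, 0x0000));
    return 0;
}
```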
 
Everything is faster when it launches, or it doesn't launch. And I still remember that while faster, it wasn't a whole lot faster for those of us who used computers daily, so it was a 'wait and see' upgrade, not a 'holy cow that's faster and we need that' upgrade. Again, if you absolutely had to have it (an architect using CAD all day, etc.), you had the money for the ROI. But for the rest of us, it was just something that was pretty to look at and that's it.

Don't confuse release dates with actual availability. Pentium boards were painfully hard to get since Intel was the only one making them, the P5 was stupid expensive, and all the while the 486 clones were right there, ready to go in a matter of months, and motherboard BIOSes were revamped easily.

There might be nothing you remember from 1993 that compares to a P60, but I recall we had already spec'd out our Cyrix P166+ or even bought it by then--and it would stomp a P60.

PCI was a stupid pinout that just 'stuck a finger up and FU' to all the existing buses at the time, when they could have easily used the VLB approach and connector for cards and allowed existing investments to continue working. This heavy-handed approach did allow fully 32-bit 486 platforms to exist with 32-bit memory, PCI, and the 486 CPU, but these were not significantly faster than the same 486es with 8-bit memory modules and ISA versions of the PCI cards. Forcing the entire userbase to choose one bus or another was an ahole move imo, and I still remember that sting. Prior to this you could move your cards from one system to another and carry your investment forward. Now it was trash everything and start over, which today is the norm unfortunately.

Eventually there would have been a change and a development for sure, but I don't think it would have been the P60--it would have had to be something much better to compete with the 486 units, which would have had things on them like a heatsink/fan for the many higher multipliers that could have run with forced cooling--cooling which didn't even exist on the original 486 series.

Doom using protected mode to become 32-bit software is like Windows 3.1 using protected mode to become 32-bit--it still isn't fully 32-bit, and generally, by definition, software is deemed xx-bit only if all of it is at least xx-bit. At least that's the definition from that era that I recall, because Win3.1, Win95, and Win98SE were all deemed not 32-bit, even with Win32s. Only NT was deemed 32-bit, and on a Pentium Pro you could feel it too vs. the others.

Interesting how there was a way to update the microcode on the PPro. But I'm sure it had the same effect a lot of the recent Intel patches have had on performance, so back then the hit would have been too significant to be practical.

The Pentium Pro was definitely slower in 98SE than a comparable P133 when it was hitting 16-bit stuff. It was only when something 32-bit was thrown at it that it shone. MP3 encoding was quite fast in comparison, from what I recall. I ran my IBM PP180 and PP200 machines alongside an IBM P133 as well as the Cyrix P166+. No doubt the Intel products were better on CPU-intensive stuff, but the PP180/200 was a disappointment, since the much higher clock speeds didn't translate to the expected performance for that clock, unlike the later Pentium 200 that did - and clock speed was the measure of speed back then.

Home users are what ruined computers imo. The junk started arriving in the supply chain with the heralded arrival of the home computer. Packard Bell and HP were the top brands to blame for this, from my recollection.

lol, so you do have a sense of humor. :)

Quake was definitely Pentium if you wanted to be competitive. The 486 and even Pentium alternatives were terrible by comparison, even with a good video card.

That's where I think our definitions differ, since I recall that software was xx-bit only when 100% of the code met the xx-bit threshold.
 
Everything is faster when it launches, or it doesn't launch. And I still remember that while faster, it wasn't a whole lot faster for those of us who used computers daily, so it was a 'wait and see' upgrade, not a 'holy cow that's faster and we need that' upgrade. Again, if you absolutely had to have it (an architect using CAD all day, etc.), you had the money for the ROI. But for the rest of us, it was just something that was pretty to look at and that's it.
The same could be said for most Intel/AMD CPU releases.
Most of them didn't even begin to approach the performance jump that happened from 486 to 586 - maybe only Pentium D to Core 2 did.

Don't confuse release dates with actual availability. Pentium boards were painfully hard to get since Intel was the only one making them, the P5 was stupid expensive, and all the while the 486 clones were right there, ready to go in a matter of months, and motherboard BIOSes were revamped easily.
I'm not sure about actual availability, but back then distribution wasn't as quick as it is today, for sure.
By the time you got the computer press with Pentium benchmarks, you could probably plan a trip to a bigger city with a better-equipped shop, or find the phone numbers, order a Pentium system, and have it delivered in a few weeks.
The last CPU I got, a Core i5-13600K, I ordered on Thursday, its release day, and by Saturday before noon I was putting it together.

If only it always worked like that... then I could perhaps have had a PS5 much, much sooner 🙃

There might be nothing you remember from 1993 that compares to a P60, but I recall we had already spec'd out our Cyrix P166+ or even bought it by then--and it would stomp a P60.
Yeah, sure, you certainly had a Cyrix P166+, a 133 MHz CPU for Pentium boards. It being released on February 5th, 1996 doesn't mean its availability was as bad as Intel's, and if you got it three years earlier then maybe back then time ran backward 🙃

Even the Cyrix 5x86 was released much later than the Pentium - in 1995. Cyrix was very slow to join the 586 party in any shape or form.

PCI was a stupid pinout that just 'stuck a finger up and FU' to all the existing buses at the time, when they could have easily used the VLB approach and connector for cards and allowed existing investments to continue working. This heavy-handed approach did allow fully 32-bit 486 platforms to exist with 32-bit memory, PCI, and the 486 CPU, but these were not significantly faster than the same 486es with 8-bit memory modules and ISA versions of the PCI cards. Forcing the entire userbase to choose one bus or another was an ahole move imo, and I still remember that sting. Prior to this you could move your cards from one system to another and carry your investment forward. Now it was trash everything and start over, which today is the norm unfortunately.
Was VLB even endorsed by Intel to begin with?
VLB looks like a hacky 'let's just connect cards to the CPU bus using this cheap connector from MCA and make the cards and CPU worry about the details' solution to mitigate a performance bottleneck, not a proper bus standard.
The same manufacturers that used it on the 486 could, through glue-logic chips, add it to Pentium systems, and some actually did. It quickly became obsolete when the industry moved to the superior PCI.
PCI was superior because it was a proper bus standard, not because of its bandwidth - bandwidth could be as good on VLB as it was on PCI.

Eventually there would have been a change and a development for sure, but I don't think it would have been the P60--it would have had to be something much better to compete with the 486 units, which would have had things on them like a heatsink/fan for the many higher multipliers that could have run with forced cooling--cooling which didn't even exist on the original 486 series.
Intel could not produce a much faster 486, as the process node at the time was limited to ~66 MHz.
Intel could have made a faster CPU for the 486 bus, like the original Pentium with perhaps larger caches to mitigate the bus bottleneck. Such a thing would, however, have been even more expensive and still not as fast as what we got, and there would have been even less incentive to get it. Especially later, when the process node allowed faster 486 variants, no one would have bought such a 'Pentium', just like no one bought the Pentium Overdrive.

Doom using protected mode to become 32-bit software is like Windows 3.1 using protected mode to become 32-bit--it still isn't fully 32-bit, and generally, by definition, software is deemed xx-bit only if all of it is at least xx-bit.
So if I get the source code of any program or OS and find even one instance of an 8-bit variable in there, does that mean it is 8-bit software / an 8-bit OS? 🤯
Bitness is defined by the width of pointers...

At least that's the definition from that era that I recall, because Win3.1, Win95, and Win98SE were all deemed not 32-bit, even with Win32s. Only NT was deemed 32-bit, and on a Pentium Pro you could feel it too vs. the others.
...and Win9x used 32-bit pointers and flat memory space, not 16-bit segmented memory.
Win32s was a workaround for Windows 3.1 to run 32-bit Windows software, and as far as the program was concerned, it ran on 32-bit Windows - because Win32s configured the CPU, and more specifically its memory management, for 32-bit operation.
You could literally run 32-bit software on 16-bit Windows and get performance that was the same as if you ran it on Windows NT. Say you had to add a million 32-bit numbers together. You could do that on a 386DX under Win32s at least twice as fast as on the same Windows using 16-bit code... unless perhaps you wrote your application as 16-bit but used assembler to force the calculations to be 32-bit... but then again, would that truly be a 16-bit program?
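As a toy version of that 'add a million 32-bit numbers' example (my own sketch, not a benchmark): the C source is identical either way, but a 16-bit compiler has to emit an ADD/ADC pair for every 32-bit addition, while a 32-bit Win32s/NT build does each one in a single instruction on a 386 or later.

```c
#include <stdio.h>

int main(void)
{
    unsigned long sum = 0;   /* 32-bit arithmetic on both 16-bit and 32-bit compilers of the era */
    unsigned long i;

    for (i = 1; i <= 1000000UL; i++)
        sum += i;            /* one 32-bit ADD in a 32-bit build; ADD + ADC in a 16-bit build */

    /* The exact value may wrap on 32-bit longs; the point is the work per addition, not the result. */
    printf("sum = %lu\n", sum);
    return 0;
}
```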

Anyway, Windows 9x had lots of legacy 16-bit code and used BIOS interfaces, but that doesn't mean it was a 16-bit OS.

Doom - and I already wrote this - was 32-bit, as it used 32-bit pointers. The only 16-bit part of it was jumping to 16-bit real mode to issue BIOS/DOS calls.

Interesting how there was a way to update the microcode on the PPro. But I'm sure it had the same effect a lot of the recent Intel patches have had on performance, so back then the hit would have been too significant to be practical.
Intel has no interest in making already-sold CPUs faster - also because it's ridiculously expensive to validate that any changes didn't introduce new bugs, and it's easy to introduce them.
Security patches tend to make things slower, and the same is usually true for other bugfixes.

The true beauty of the Pentium Pro microcode system, which is still used to this day, is that the end user doesn't really know whether they are running upgraded microcode or not. If not via a BIOS update, then an OS update will load new microcode.
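As a side note for anyone curious on a modern box: on Linux, the currently loaded microcode revision (whether it came from the BIOS or from the OS's early update) shows up in /proc/cpuinfo. A small, Linux- and x86-specific sketch:

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[256];
    FILE *f = fopen("/proc/cpuinfo", "r");

    if (f == NULL) {
        perror("/proc/cpuinfo");
        return 1;
    }
    while (fgets(line, sizeof line, f) != NULL) {
        /* Each logical CPU repeats the field, e.g. "microcode : 0x12c", so the first hit is enough. */
        if (strncmp(line, "microcode", 9) == 0) {
            fputs(line, stdout);
            break;
        }
    }
    fclose(f);
    return 0;
}
```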

The Pentium Pro was definitely slower in 98SE than a comparable P133 when it was hitting 16-bit stuff. It was only when something 32-bit was thrown at it that it shone. MP3 encoding was quite fast in comparison, from what I recall. I ran my IBM PP180 and PP200 machines alongside an IBM P133 as well as the Cyrix P166+. No doubt the Intel products were better on CPU-intensive stuff, but the PP180/200 was a disappointment, since the much higher clock speeds didn't translate to the expected performance for that clock, unlike the later Pentium 200 that did - and clock speed was the measure of speed back then.
I call BS on this.
Windows 9x and DOS benchmarks indicate the Pentium Pro is faster than the Pentium MMX, which itself is universally faster than the Pentium without MMX - due to larger caches and a better instruction decoder, not so much because of MMX itself, which marketing mostly credited for the performance increase.

The whole 'Pentium Pro is a Windows NT beast' thing was due to some specialized software running much faster on it - and with a 1MB L2 cache you would expect some programs to show significant performance improvements.

Whenever something interesting is found, people - especially those who consider themselves experts - will put out explanations. Often these speculations, even if not valid, transform over time into 'facts', and here it seems they transformed into a 'my PP180 was slower than a P133' fake memory. It might have been slower than the Cyrix P166+, though, because that particular processor was apparently faster than light itself 🤪

And BTW, Cyrix had a more performant INT/ALU implementation of the post-RISC approach and per-clock should be faster than the Pentium Pro in some programs. Cyrix didn't build their CPU with 1MB of L2 in the package, and it had lower clock speeds and a weak FPU, so you didn't hear about how great Cyrix CPUs were for Windows NT...

Home users are what ruined computers imo. The junk started arriving in the supply chain with the heralded arrival of the home computer. Packard Bell and HP were the top brands to blame for this, from my recollection.
Ruined what exactly? 😪

The IBM PC was made to tap into the home computer market. It became a hit in organizations (IBM, but... cheap... relatively speaking - what's not to love enough to computerize your whole office with it?), which also made it a hit for home users.
You thus cannot say home users ruined anything - home users were the intended target. Only price and availability made the PC dominate other markets.


That's where I think our definitions differ, since I recall that software was xx-bit only when 100% of the code met the xx-bit threshold.
Code is N-bit when its pointers are N-bit.
It's a little more complicated for various edge cases, like using 32-bit opcodes in DOS programs, but in that case it's an academic discussion - you would still need at the very least a 386SX to run such a program.

For your run-of-the-mill Windows 11 applications, you have those with 32-bit pointers, the so-called 32-bit applications, and those that use 64-bit pointers, also called 64-bit applications.
I could write and compile an x86_64 application that only uses 8-bit variables and pointers to 8-bit data, but it would still be a 64-bit application.
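A tiny sketch of that last point (my own example): build this for an x86_64 target and the pointer is 8 bytes even though every piece of data it touches is 8-bit; build the same source for a 32-bit target and it prints 4 instead.

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t byte = 42;       /* the only data in the whole program is 8-bit... */
    uint8_t *p = &byte;

    printf("sizeof(*p) = %u\n", (unsigned)sizeof *p);   /* 1 on any target       */
    printf("sizeof(p)  = %u\n", (unsigned)sizeof p);    /* 8 on x86_64, 4 on x86 */
    return 0;
}
```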
 
There is nothing from '93 which is comparable to the Pentium 60.
Pretty sure the IBM PowerPC 601 CPU crushed the original Pentium clock-for-clock.
I still have a 601 clocked at 100MHz that runs circles around a Pentium 100MHz, and both would have been released around the same time in 1993.

There was quite the price difference between them, though, so Intel did win in terms of cost/performance at the time.
 
Pretty sure the IBM PowerPC 601 CPU crushed the original Pentium clock-for-clock.
I still have a 601 clocked at 100MHz that runs circles around a Pentium 100MHz, and both would have been released around the same time in 1993.

There was quite the price difference between them, though, so Intel did win in terms of cost/performance at the time.
And the platform - you couldn't get PowerPC as easily either. Somehow I want to say there were shortages or parts issues too...?
 
Intel Atom Z530. Slow as balls and the GMA-500 was worse than the 950. Basically the PowerVR SGX-535 but for use with computers. Whee....
 
My vote is definitely with the first-gen Willamette RDRAM P4s.

I was on AXPs at the time and just laughed at their silly scores in 3DMark and Aquamark. Great times.

.... then c2d showed up :sick:
 
Pretty sure the IBM PowerPC 601 CPU crushed the original Pentium clock-for-clock.
I still have a 601 clocked at 100MHz that runs circles around a Pentium 100MHz, and both would have been released around the same time in 1993.

There was quite the price difference between them, though, so Intel did win in terms of cost/performance at the time.
PowerPC could not run x86 software, thus it is a different category. I was specifically talking about x86_32.

Do you have any concrete benchmarks showing how PowerPC crushes the Pentium?

From my past research and various comparisons, it was more like the PowerPC had similar INT/ALU performance to the 68060, and both had similar performance to the Cyrix 6x86 - which were ~20-25% faster per clock than the Pentium. That is not enough to claim the Pentium die ever received any blunt-force trauma.

FPU-wise, the situation is more complicated. I can confidently say the Motorola 68060 and Cyrix 6x86 remain a very close match.
PowerPC vs. Pentium is harder to call: PowerPC seemed to fare better in typical industry-standard benchmarks, while the Pentium had big potential that could be unlocked by hand-coded assembly optimization, which software often took advantage of - in those cases the Pentium closed the performance gap and had comparable FPU performance. PowerPC might still have been slightly faster, but by less than its INT/ALU advantage, and I am not really convinced* it was faster.

In other words, the way I see it, PowerPC was faster than the Pentium, but the performance difference was not that big.

The Pentium Pro reduced the performance gap, and PowerPC had very comparable performance to its Intel and, later, AMD counterparts. In fact, the PowerPC G5 had pretty much the same performance as the AMD K8. The G5 was much smaller, but the K8 used much less power... and the price difference between G5- and K8-based systems was such that to buy a G5 computer you had to have a good management job, while to have a K8-based computer you could be a jobless student who bothered to collect beer cans 🤪

*) My comparisons are mostly based on various Amiga benchmarks. That might not be the best implementation of this CPU, and the Amiga notoriously had graphics-card bottlenecks... oh the irony 🫣
 
The Pentium Pro was definitely slower in 98SE than a comparable P133 when it was hitting 16-bit stuff. It was only when something 32-bit was thrown at it that it shone. MP3 encoding was quite fast in comparison, from what I recall. I ran my IBM PP180 and PP200 machines alongside an IBM P133 as well as the Cyrix P166+. No doubt the Intel products were better on CPU-intensive stuff, but the PP180/200 was a disappointment, since the much higher clock speeds didn't translate to the expected performance for that clock, unlike the later Pentium 200 that did - and clock speed was the measure of speed back then.

http://www.rcollins.org/ddj/Aug98/Aug98.html
 
I didn’t enjoy the AMD K6-2.

I will respectfully disagree with you on this - the K6-2 and K6-3 CPUs I owned were all fantastic with the K6-2/266 and K6-3/450 being particular standouts. Bang-for-the-buck was off the chain with these, and the K6-2/266 ran so damn cool it almost didn't even need a fan on it.

I will also add that I used them exclusively with Tyan Super-7 motherboards AND I was never a fan of Quake (motion sickness). Shitty motherboards were very much the bane of the Super 7 platform, and anything that relied on the FPU extensively (like Quake) definitely ran better on the Intel stuff. That being said... running better on the Intel did not mean it ran poorly on the AMD AND there really weren't all that many games or apps that relied quite so heavily on the FPU anyway.
 
Worst is a very subjective adjective. I would say that the fact that Bulldozer was slower than Phenom II has to put it in the top 5. Did it not work at all? No. But IMO a successor CPU should never be slower than its predecessor MHz for MHz.
When I upgraded from a Phenom II 965 to an FX-8350 it was significantly faster, but I remember a lot of people complaining that the FX-8150 was meh...
 
When I upgraded from a Phenom II 965 to an FX-8350 it was significantly faster, but I remember a lot of people complaining that the FX-8150 was meh...
Well, they were more like 4-core/8-thread chips than the true 8-core they claimed to be. Mine always worked fine for what I used them for. There was barely any difference between the 8150 and 8350, though...
 
I will respectfully disagree with you on this - the K6-2 and K6-3 CPUs I owned were all fantastic with the K6-2/266 and K6-3/450 being particular standouts. Bang-for-the-buck was off the chain with these, and the K6-2/266 ran so damn cool it almost didn't even need a fan on it.

I will also add that I used them exclusively with Tyan Super-7 motherboards AND I was never a fan of Quake (motion sickness). Shitty motherboards were very much the bane of the Super 7 platform, and anything that relied on the FPU extensively (like Quake) definitely ran better on the Intel stuff. That being said... running better on the Intel did not mean it ran poorly on the AMD AND there really weren't all that many games or apps that relied quite so heavily on the FPU anyway.
It's really the motherboards that make or break the K6-2 and K6-3 ownership experience. To some degree that's true to this day. However, the days of multiple chipset brands and models were much more difficult on the consumer as the 3rd party chipsets were hot garbage most of the time.
 
It's really the motherboards that make or break the K6-2 and K6-3 ownership experience. To some degree that's true to this day. However, the days of multiple chipset brands and models were much more difficult on the consumer as the 3rd party chipsets were hot garbage most of the time.
Agreed. Especially if you were on a budget. The amount of time spent convincing SiS or other chipsets to play nice just to save some $$…. And then even the high end ones were touchy.
 
The VIA Nano is another turd. It was used in some netbooks when that shitshow of a fad was exploding. It felt like VIA's last gasp, as you didn't really see anything from them otherwise.
 
When I upgraded from a Phenom II 965 to an FX-8350 it was significantly faster, but I remember a lot of people complaining that the FX-8150 was meh...
See, I upgraded from an 1100T (still have it) to an 8150 (still have it). It was not that impressive for me. I later went to a 2500K (four cores, no HT) and it was significantly faster.
 
See, I upgraded from an 1100T (still have it) to an 8150 (still have it). It was not that impressive for me. I later went to a 2500K (four cores, no HT) and it was significantly faster.

To each their own, and it probably depends on what you upgraded from.

I went from the Phenom 9500 paired with a GTX 275 to a Phenom II 965, initially with the same GTX 275, which felt like a huge jump (later paired with GTX 470s in SLI), and then from the 965 to the FX-8350 (later paired with GTX 670s in SLI). Each jump to the new CPU was with the older GPU (upgrading the GPU later), and I noticed a significant increase in FPS with just the CPU upgrade.

I think it had to do with CPU bottlenecks compared to the GPUs they were paired with (i.e. Phenom 9500 w/ GTX 275 to Phenom II 965 and then Phenom II 965 w/ GTX 470s in SLI to a FX-8350).

In terms of worst CPU, my vote goes to the Phenom 9500. It was terrible compared to Intel's quad-core at the time (the Q6600) and quickly became a CPU bottleneck.
 
Either Prescott Pentium 4s or Bulldozer. The Prescott Pentium 4 was a crazy power hog and unable to match AMD Athlon 64 and X2 in most areas.

Bulldozer was the same thing for AMD. It could not match Intel's Core i5/i7, and in some cases, Bulldozer was worse than the previous Phenom II chips. AMD really tried to take multi-threading to the extreme, and in the process, pulled a Pentium 4 where the chip was hot and power hungry.
 
I owned an FX 8320 (Piledriver). It sucked. It ran super hot and the IPC and ST performance was worse than my previous Phenom II 965BE.
I owned an FX-6300 for $80. It worked just fine. There are CPUs listed here that aren't actually functional or stable. This is worst CPUs of 'all time', not mediocre.
 
Either Prescott Pentium 4s or Bulldozer. The Prescott Pentium 4 was a crazy power hog and unable to match AMD Athlon 64 and X2 in most areas.

Bulldozer was the same thing for AMD. It could not match Intel's Core i5/i7, and in some cases, Bulldozer was worse than the previous Phenom II chips. AMD really tried to take multi-threading to the extreme, and in the process, pulled a Pentium 4 where the chip was hot and power hungry.
Prescott Pentium 4s shouldn't even make the list. The Prescott Pentium 4 did run hot, but if you could cool them they overclocked like crazy. To be clear, all NetBurst CPUs were bad compared to their AMD counterparts at the time, but they weren't necessarily bad products. Before the dual-core Athlon X2s came out, many would still argue the Pentium 4 offered a better multi-tasking experience as well as a better general experience. It's really in the benchmarks where AMD kicked the piss out of them. Obviously for gaming, going AMD was a no-brainer, but for video editing and multitasking, Intel still had AMD beat.

Otherwise I'd agree. Bulldozer was essentially AMD's Netburst/Pentium 4. Though one could argue that until the Athlon, that's practically all AMD made compared to Intel.
 
Prescott Pentium 4s shouldn't even make the list. The Prescott Pentium 4 did run hot, but if you could cool them they overclocked like crazy.
Imho Prescott was a disaster. It looks like a failed core project that was only released because it was still cheaper for Intel to go with it - in this sense it's similar to Bulldozer, which also didn't live up to initial expectations but which AMD was forced to release.
Prescott's IPC decreased despite doubling the L2 cache and transistor count. Power/heat increased despite the node shrink from 130 nm to 90 nm.
Intel fans (and all the people with OEM computers - Intel dominated OEMs) would probably have been much better off if Intel had just put the same Northwood core on 90 nm... or, with the same transistor budget, put in two Northwood cores!!!

To be clear, all NetBurst CPUs were bad compared to their AMD counterparts at the time, but they weren't necessarily bad products.
Most of the time, high-end NetBurst systems were the fastest processors money could buy, so in that sense they had their positives.
Prescott, however, was competing mostly with the AMD K8 and was itself hardly an upgrade over Northwood, so it didn't really look good for Intel.
The fastest single-core NetBurst is the Pentium 4 Extreme Edition on the Gallatin core.

Willamette was terrible - everyone agreed on that.
Northwood was... well, it at least gave us 3+ GHz CPUs, and even if that was more like 2+ GHz performance, it at least lit up the imagination.
By the time Prescott came out, everyone was talking about its insane power consumption/temperatures, and at that point everyone knew those were 'empty' GHz and that Prescott was effectively worse than Northwood.

Before the dual-core Athlon X2s came out, many would still argue the Pentium 4 offered a better multi-tasking experience as well as a better general experience.
True, but only really true for Pentium 4 models with Hyper-Threading.
HT itself was really amazing at that time, and even if only partially, it gave a preview of the multi-tasking to come.
HT was, however, already available on Northwood, and like I suggested, Intel could have put two Northwood cores in the same transistor budget... if they had done that instead of Prescott, at the time they released Prescott, then we would have had dual cores that much earlier.

This makes me think even less of Prescott...

It's really in the benchmarks where AMD kicked the piss out of them. Obviously for gaming, going AMD was a no-brainer, but for video editing and multitasking, Intel still had AMD beat.
True, but not thanks to the Prescott core design specifically.
Prescott had SSE3, which some programs took advantage of, but overall it wasn't really worth it.

BTW, I do not see any good product in the Prescott series.
The Pentium D (which is pretty much two Prescotts) was pretty much a laughing stock in tests vs. the Athlon X2, but it did have one good CPU: the Pentium D 805 - the best performance/price of any dual-core until the Core 2 Duos were released (because they were priced well too, and while more expensive they were also much faster), and then especially the Pentium Dual-Core (like the ~$80 E2140-class processors).

Also, AMD screwed up the Athlon X2 cores in that they didn't work well in Windows XP. There were lots of issues in games and a need to change affinity, because games could go crazy otherwise. The Pentium D gave maybe a slower and hotter experience, but at least a more stable and consistent one.
 
I never had good luck with either the K6-2 or K6-3 back then. But in fairness to them, it was probably the motherboard and shitty SiS chipsets I was trying to run them on.
The VIA Apollo MVP3 (IIRC?) was the chipset to get for Super Socket 7 (yes, that's what AMD really called it). And even then some AGP 2x cards had issues. AGP really wasn't designed for Socket7, and it shows in the crappy implementation by the non-Intel chipset outfits of the time. Only 3dfx cards worked without issue. Mostly.
 
The VIA Apollo MVP3 (IIRC?) was the chipset to get for Super Socket 7 (yes, that's what AMD really called it). And even then some AGP 2x cards had issues. AGP really wasn't designed for Socket7, and it shows in the crappy implementation by the non-Intel chipset outfits of the time. Only 3dfx cards worked without issue. Mostly.
VIA chipsets of the time had issues with AGP cards even on Slot-1/Socket 370 motherboards.
I remember trying to make a GeForce 256 work on a cheap VIA motherboard with a Celeron, and this combo didn't work well. Changing the motherboard to a 440BX resolved all the issues and the card ran great.

3dfx cards were much more compatible because they didn't really support AGP so much as use it like 66 MHz PCI. I am not even sure they could take any advantage of AGP 2x, and even if they could, the 3dfx cards that came in both AGP and PCI versions didn't show much difference between AGP and PCI at the same clocks, so '66 MHz PCI' makes sense.

Anyway, without needing any of the more nuanced features like AGP texturing memory, cards like the 3dfx ones could work well even on early chipsets developed before graphics cards really needed those features. That said, for retro system builders - I am not sure drivers didn't improve things. VIA, for what it's worth, supported their chipsets for a very long time and improved the drivers for these chipsets even on older operating systems. It therefore wouldn't be too surprising to find no issues with these older chipsets even with more modern GPUs.
 
VIA chipsets of the time had issues with AGP cards even on Slot-1/Socket 370 motherboards.
I remember trying to make a GeForce 256 work on a cheap VIA motherboard with a Celeron, and this combo didn't work well. Changing the motherboard to a 440BX resolved all the issues and the card ran great.

3dfx cards were much more compatible because they didn't really support AGP so much as use it like 66 MHz PCI. I am not even sure they could take any advantage of AGP 2x, and even if they could, the 3dfx cards that came in both AGP and PCI versions didn't show much difference between AGP and PCI at the same clocks, so '66 MHz PCI' makes sense.

Anyway, without needing any of the more nuanced features like AGP texturing memory, cards like the 3dfx ones could work well even on early chipsets developed before graphics cards really needed those features. That said, for retro system builders - I am not sure drivers didn't improve things. VIA, for what it's worth, supported their chipsets for a very long time and improved the drivers for these chipsets even on older operating systems. It therefore wouldn't be too surprising to find no issues with these older chipsets even with more modern GPUs.

Staying on top of the latest VIA Chipset driver was very much a thing for sockets all the way from Super-7 through AM2.
 
Staying on top of the latest VIA Chipset driver was very much a thing for sockets all the way from Super-7 through AM2.
Yes, indeed. Some versions frustratingly fixed major issues while creating others. I do not miss the days of countless VIA 4-in-1 driver packages until shit worked. After nForce 2 I didn't look at anything non-nVIDIA until they were forced out of the market, aside from my old Gigabyte GA-EP35-DS3R Intel P35 board. 680i/780i/790i Ultra SLI boards were just too expensive for me at the time.
 
It's really the motherboards that make or break the K6-2 and K6-3 ownership experience. To some degree that's true to this day. However, the days of multiple chipset brands and models were much more difficult on the consumer as the 3rd party chipsets were hot garbage most of the time.
Great, now I'm having PC-Chips M598/SiS 530 PTSD. What the hell was my dad thinking, building a computer around that board?

It wasn't stable at a 100 MHz FSB despite having PC-100 SDRAM fitted, so he ran the CPU at a 66 MHz FSB (presumably 5.5 x 66 MHz, about 366 MHz, instead of 3.5 x 100 MHz), which caused me to believe for many years that I had a 366 MHz K6-2 rather than the 350 MHz one I finally discovered when I got around to removing the heatsink and the old thermal paste.

The SiS 530 proudly touted AGP graphics... except you had no AGP slot; it was electrically dedicated to the crappy integrated graphics core, which straight-up lost DirectDraw acceleration if you updated DirectX too far (not sure if it was 7 or 8 that did it), so it wasn't even a pleasant experience for 2D stuff.

The CPU socket was behind two of the three PCI slots, so forget about Voodoo2 SLI - the cards are too long.

Some idiot put an I/O header for one of the serial or parallel ports between the expansion slots, too, so have fun taking that off and on repeatedly when working with PCI cards!

As much as Super Socket 7 is trending nowadays for some weird reason, I do not miss that motherboard one bit; it went off to a recycler decades ago. I'd rather build a cheap Pentium II, III or Athlon system for that era of computing - ATX form factor (still used to this day!), generally better performance, and best of all, the chipsets most likely aren't complete ass that'll drive you nuts with instability.

We sure have it easy nowadays, with a lot of the old northbridge functionality on the CPU now - memory controllers, PCIe lanes, and so forth. For the most part, that stuff just works with today's OSes.
 
The only reason I can think of for the above (why said PC-Chips board was selected) is price. They were the cheapest at the time (and barely functional).
 
Yes, indeed. Some versions frustratingly fixed major issues while creating others. I do not miss the days of countless VIA 4-in-1 driver packages until shit worked. After nForce 2 I didn't look at anything non-nVIDIA until they were forced out of the market, aside from my old Gigabyte GA-EP35-DS3R Intel P35 board. 680i/780i/790i Ultra SLI boards were just too expensive for me at the time.
Nvidia had fast chipsets when it came to memory speed and general performance, but they ran pretty hot - and not in a good way - and FSB overclocking was pretty limited compared to Intel.

I had two nForce chipsets for Intel - I don't remember which, but the first I used with a Pentium D 805 and the second with a Core 2 E7200 - and in both cases I maxed out the CPU frequency before the FSB limit really affected me. The E7200 I used for a short while with some Intel chipset and maxed out the FSB - it ran marginally better than on the nForce.

Anyway, it was kinda obvious that after some point in time Nvidia didn't put too much effort into these chipsets; they all seemed like rehashes of the same product, just overclocked more and more. Zero issues on both Nvidia and Intel, so with neither being better here, it doesn't really matter that we only have one of them now.

As much as Super Socket 7 is trending nowadays for some weird reason, I do not miss that motherboard one bit; it went off to a recycler decades ago. I'd rather build a cheap Pentium II, III or Athlon system for that era of computing - ATX form factor (still used to this day!), generally better performance, and best of all, the chipsets most likely aren't complete ass that'll drive you nuts with instability.
A 440BX board plus a Pentium III, or even better a Tualatin Cellie, is the best ISA system one can build.
I guess many people build Super Socket 7 systems because they never had one?

We sure have it easy nowadays, with a lot of the old northbridge functionality on the CPU now - memory controllers, PCIe lanes, and so forth. For the most part, that stuff just works with today's OSes.
For the most part it was easy back then too, except for that strange period when the AMD K7 didn't yet have nForce2 chipsets.
I mean, you bought Intel + Intel if you didn't want any surprises. The whole Super Socket 7 platform with its dodgy motherboards was a cheap-o-fest where one needed to tread very carefully.

That said, there were lots of bad mobos for Intel too - like the unfortunately popular, infamous low-end BXCell and Aladdin ALi/ULi ones, and other waste-of-sand products :)
 
Yup, most of these CPUs would at least get stuff done or play a game. Not like the AMD E1's I mentioned.
Exactly. A CPU being slow doesn't necessarily make it bad. What makes it bad is being touted as a better performer than it is, and/or having serious technical flaws that create actual issues with its use. The Pentium FDIV bug is a good example of that. When your CPU can't do math properly, it is (and was) a problem. Bulldozer being power hungry, running hot and performing like crap (worse than Phenom II clock for clock) when AMD made promises to the contrary is another. Yes, it worked but the only thing it really had going for it is that it worked in existing AM3+ motherboards and it was cheap.
 
Yes, it worked but the only thing it really had going for it is that it worked in existing AM3+ motherboards and it was cheap.
Which still places Bulldozer outside of 'worst CPUs ever'. It served a purpose at the very least. The FX-6300 was at home in quite a number of very budget builds. There are CPUs on the list that literally aren't stable lol.
 
Which still places Bulldozer outside of 'worst CPUs ever'. It served a purpose at the very least. The FX-6300 was at home in quite a number of very budget builds.
It wasn't AMD's intended purpose, though. It's not like their APUs, which were always intended to be low-cost, low-power options for laptops and small-form-factor or ultra-budget PCs. Bulldozer CPUs had to be priced as if they were intended for the bargain bins at Wal-Mart, or they wouldn't have sold to anyone except AMD zealots.
 
It wasn't AMD's intended purpose, though. It's not like their APUs, which were always intended to be low-cost, low-power options for laptops and small-form-factor or ultra-budget PCs. Bulldozer CPUs had to be priced as if they were intended for the bargain bins at Wal-Mart, or they wouldn't have sold to anyone except AMD zealots.
That changes nothing about what I said. Bulldozer does not belong in a list of CPUs that aren't even capable of functioning as a CPU due to issues. High-school me was thankful I could get an FX-6300 for $80. It made for a pretty decent gaming PC. I'm not even saying it was a good CPU.
 