AMD, thank you for AM3+

c3k

I built a "place-holder" computer for the wife about two years ago. I used an Asus microATX AM3+ mobo to run a spare Phenom II 720BE I had. (It was spare because I bought a 1090T to go in its original mobo in a different build.) Nice little chip, that 720. I could unlock the 4th core OR OC it to ~3.0 GHz (my memory may be off on that speed).

Anyway, I purchased an 8350, flashed the BIOS, tossed the new CPU in and, voila, I now have a blazing fast 8-core machine for my wife. No mobo upgrade needed.

One processor, two machines, each upgraded, one to a newer generation of CPU. Sure, an Intel i7-3770K would probably beat that 8350, but I didn't need to buy a mobo or memory, or spend twice the price. It cost ~$170. (I figure the i7-3770K + mobo would've run ~$550+.)

So, thank you AMD for thinking of the consumer.

Ken
 
Yes, and it will be current through 2014. Vere naice.

AM2+ as well... I saw my AM2+ Gigabyte 770 could support a Phenom II x6 with DDR2 :eek:
 
This is sort of like saying the cars in Cuba are current as of the 1950's. AMD wasn't thinking of your happiness when they did this; they were thinking of their wallet and how they can't afford to spend money developing a new desktop socket (they've talked about it numerous times). This is not "good", no matter how nice it is for the end user to not have to change their board for a few years. All it means is Intel can continue doing Whatever It Wants™ mostly unchallenged. I for one would have loved to see a return to the AMD of the past, but it seems like APUs are the way they want to go now.
 
This is sort of like saying the cars in Cuba are current as of the 1950's. AMD wasn't thinking of your happiness when they did this; they were thinking of their wallet and how they can't afford to spend money developing a new desktop socket (they've talked about it numerous times). This is not "good", no matter how nice it is for the end user to not have to change their board for a few years. All it means is Intel can continue doing Whatever It Wants™ mostly unchallenged. I for one would have loved to see a return to the AMD of the past, but it seems like APUs are the way they want to go now.

He just said he is (HAPPY) with his purchase and you come in here and knock him down.
Nice job bro, go back and pat yourself on the back in the Intel forums.

Glad to see your wife is liking the 8350, nice chip.
 
He just said he is (HAPPY) with his purchase and you come in here and knock him down.
Nice job bro, go back and pat yourself on the back in the Intel forums.

Glad to see your wife is liking the 8350, nice chip.

If anything he spoke the truth!!!

If he was trying to be a "fanboi" he'd say something like: "Should've gotten a 3570K and a $100 mobo and OC'd the shit out of it, and saved gazillions on power bills..." :rolleyes:
 
^ Fanboi much?

No? I've just recently gotten rid of my FX-8150 and CHV setup, which was my gaming rig while my Intel machines were purely for folding. The only reason I didn't just grab an 8350 and save my money was that I wanted a rig that can fold 20 hours a day and make good use of the power it draws.

I stated nothing but reality. AMD claims it's dropping out of the high-end desktop market, Intel claims the next gen will be BGA chips soldered to the motherboard and delays Haswell; if I hate anyone it's the big blue giant that will force whatever they want on us at this point. Competition is what keeps the market from stagnating. That doesn't mean I'll choose to use a product that doesn't work as well for what I want, though.

The fact that he likes the chip and its performance is great, but my point was: don't mistake AM3+ lingering for such a long time for AMD trying to do you a favor.
 
If anything he spoke the truth!!!

If he was trying to be a "fanboi" he'd say something like: "Should've gotten a 3570K and a $100 mobo and OC'd the shit out of it, and saved gazillions on power bills..." :rolleyes:

Haha , nice Tek Syndicate quote there. ;)
 
I think the lack of CPU power AMD has today is because of AM3. How old is that socket now? Do you think there could have been more improvements if AMD had gone with a new socket? Just think if Intel was still using s775 with Sandy/Ivy Bridge. They would have had to leave things out to assure backwards compatibility. I'm sure AMD did the same with BD.
 
It has been said that Steamroller will also be AM3+. I wonder which, if any, current boards will be able to support it with just a BIOS update. Now that would be nice.
 
In other news, you can also upgrade a crappy H61 chipset motherboard with a Celeron G530 you bought 1.5 years ago to a beefy 3770k (or anything else you desire). Also, water is still wet last time I checked.

Both companies offer you an assortment of processors to choose from ranging from sub $50 all the way up to $200+, along with limited upgrade options between generations.

I'll grant that having 2 refreshes available on AM3+ is impressive, but AMD is not doing it to put smiles on your faces. AMD is doing it to control costs on a dying platform. It is a happy coincidence.

AMD is not providing for such compatibility on their platforms that still sell (FM1 -> FM2 -> FM3).
 
Lots of Intel folks in here in need of a Waaaambulance. (I just called, it is on its way.) To the OP, congrats, I think it is fantastic as well. To Intel users, I would say a lot of those additional pins are for triple and quad channel RAM. (Although not all, and not on all processors either.)

I am looking forward to the Steamroller processor as well, but I am content with my FX-8120 right now. (It is better than the 1090T I had before it and I just dropped it right into my 990FX board.)
 
The AMD sockets in general have been really nice with backwards compatibility, but in reality they suck due to vendor implementations.

Example: My 890FX board was sold as AM3. The vendor then marketed it as AM3+ and released a BIOS for the 95W Bulldozers (even though it supports 140W CPUs!!!): FX-4100, 6100 and 8120. Supposedly it does NOT support the 4170/62xx/61xx/83xx/63xx/43xx. "Chipset incompatible," but they did make another 890FX board that supports all of them.

I've found this type of stupidity with multiple board manufacturers (Asus still has a "beta" BIOS for my board for the 81xx!!!)
 
I think AMD have always weighed the pros and cons -- I'm sure they have their own best interests at heart, but changing the socket is expensive, and what gains are there? They were late to adopt DDR2 and DDR3 because there was very little performance to be gained. Even Intel's quad channel shows very little benefit in the benchmarks I've seen...

I am really curious what enhancements a new socket would really net them; probably less than you think.
 
I love their socket strategy too. I went from an Athlon X2 7750 to an X3 435 to an X2 555 (unlocked to quad) to an X6 1090 over the course of about 3 years on the same motherboard. That is great when you're a broke-ass gamer/overclocker.
 
I think AMD have always weighed the pros and cons -- I'm sure they have their own best interests at heart, but changing the socket is expensive, and what gains are there? They were late to adopt DDR2 and DDR3 because there was very little performance to be gained. Even Intel's quad channel shows very little benefit in the benchmarks I've seen...

I am really curious what enhancements a new socket would really net them; probably less than you think.

For one thing, an entirely new chipset design without having to worry about older processor support. Intel was able to do away with their FSB entirely after 775.

I just like how anyone who points out flaws in maintaining an older standard is considered an Intel fanboy... like that means anything?
 
I don't think AM3 is having problems with modern games and productivity. Vishera can even run multiple VMs and multi-task well while gaming/streaming.

AMD isn't better or worse. They have made Black Edition ubiquitous, that's a plus. There likely isn't going to be a Core i3-3390K true quad-core that overclocks well for less than $120 on 1155 or new sockets. That would probably crush AMD.
 
I'm on mobile so I can't post the info... but there is a nice chart showing Intel vs AMD sockets over time.

To my amazement, since the early 90's AMD has in fact released MORE sockets than Intel! I verified the data!

If a socket change nets me a 20%+ performance increase right out of the gate, I am all for it. The only time I really complain is with RAM generation changes. If AMD had the cash to develop new features into a new, more efficient architecture (read: socket change), we would have already seen changes.

Riddle me this... same socket... 1090T... 1100T... excellent CPUs that keep up in most tests with the 8320 and outperform the 8120... ever think it's because they are trying to squeeze what they can out of a dying socket?

That's like saying current Intel i7-3770s or i5-3570s are only marginally better than a Q9500, when in fact they are a very substantial upgrade in efficiency and raw power. Every step up the Intel ladder since the Conroe days has been substantially positive, which is exactly why so many more people run Intel.

AMD has even lost headway in the low-end market. Do a search for AMD vs i3 cheap gaming rig. You might be very surprised... so I think the only people keeping AMD alive now are mid-range gamers and OEM business.

The very high end belongs to Intel, no doubt. The budget end arguably steals more and more gamers by the day... oh, and let me mention I ran AMD until Conroe released (C2D series). Since then I saved my broke-ass gamer pennies for higher-end Intel gear.
 
Well if you go that far back and count the dark days of Pentium 4, or Phenom I for AMD... Spider, lol. Some heads probably rolled.

It would be pretty bad if the 1100T were now obsolete, slightly older than the i5-2500k and i7-2600k. And it won't OC like Vishera.

The low end is probably best served by the new Pentiums over an i3, squeezing in a 7850. And that's strange to say.
 
Probably an impossible question to really answer, but I'm wondering if the i7-3770K would eventually become the cheaper option due to power consumption; it would probably take years!
 
Probably an impossible question to really answer, but I'm wondering if the i7-3770K would eventually become the cheaper option due to power consumption; it would probably take years!

Let's say it costs you $0.10/kWh.

Let's say you're running a data crunching farm. 100% utilization for 24/7.

The FX-8350 draws 213W (bit-tech).

The 3770K draws 166W (bit-tech).

From a retail price difference, it's about $80. The difference is 47 W.

80 dollars * (1000 W*hr / .1 dollars) * (1 hr / 47 W*hr) = 17000 hours, or 709 days of 24/7 utilization.

However, in terms of processing power (going off Passmark), the i7-3770K scores 9600 while the FX-8350 scores around 9200. That's a bonus of around 4.3%... which means you need around 680 days of 24/7 passmark crunching to make the 3770K more worthwhile than the 8350, in terms of passmark and cost.

Now, if we use Sysmark 2012, the i7-3770K scores a 228 while the 8350 scores a 176. That's a bonus of 1.29. Divide 709 days by 1.29, and you get around 550 days.
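
For anyone who wants to plug in their own electricity rate or benchmark scores, here's a minimal Python sketch of the break-even math above. All of the inputs (the ~$80 price gap, the bit-tech load wattages, the Passmark/Sysmark scores) are just the figures quoted in this thread, not authoritative numbers, and the "divide by the speedup" step follows the same rough reasoning as the post above.

```python
# Break-even sketch using the figures quoted above (thread assumptions, not gospel).
RATE_USD_PER_KWH = 0.10       # assumed electricity price
PRICE_GAP_USD    = 80.0       # approx. 3770K price minus FX-8350 price
LOAD_GAP_W       = 213 - 166  # bit-tech full-load draw: FX-8350 minus 3770K

# Hours of 24/7 full load before the power savings repay the price gap.
hours = PRICE_GAP_USD / (RATE_USD_PER_KWH / 1000.0 * LOAD_GAP_W)
print(f"raw break-even: {hours:,.0f} h (~{hours / 24:,.0f} days)")  # ~17,021 h, ~709 days

# Credit the 3770K for getting more work done per day (same rough logic as above).
for bench, intel, amd in [("Passmark", 9600, 9200), ("Sysmark 2012", 228, 176)]:
    speedup = intel / amd
    print(f"{bench}: ~{hours / 24 / speedup:,.0f} days of 24/7 crunching to break even")
```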
 
You'd also have to factor in heat and the cost of getting a slightly beefier power supply - assuming you'd buy the lower-power PSU for the 3770K. It just gets really messy and is an incredibly loaded question and scenario. What it boils down to is something that gets tossed around a bit but never mentioned with the amount of merit it deserves: idle power is much more important than full and even partial load figures. A tiny efficiency advantage at idle would outweigh a large wattage discrepancy at full load between two chips, because most computing is done at idle. Unless you're running F@H or Bitcoin mining, full-load power consumption isn't as big a concern as people make it out to be when it comes to energy usage and your power bill.
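
To put a rough number on that idle-versus-load point, here's a purely illustrative sketch; the 20/4 hour duty cycle and the 10 W idle gap are hypothetical values picked for the example, not measured figures for either chip.

```python
# Illustrative only: hypothetical duty cycle and idle gap, not measured values.
IDLE_HOURS, LOAD_HOURS = 20, 4   # assumed hours per day spent idle vs. at full load
IDLE_ADVANTAGE_W = 10            # hypothetical: chip A idles 10 W lower than chip B
LOAD_PENALTY_W   = 47            # chip A draws 47 W more than chip B at full load

net_wh_per_day = IDLE_ADVANTAGE_W * IDLE_HOURS - LOAD_PENALTY_W * LOAD_HOURS
print(f"chip A nets {net_wh_per_day} Wh/day")  # 200 - 188 = +12 Wh saved despite the load gap
```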

As to the topic, I have to say I'd much rather see AMD dump their socket compatibility every two years rather than stretch them out this long. Having to adhere to certain pin layouts and chipsets means that their progress gets limited.

A prime example is PCIe 3.0. AMD will see a transition to PCIe 3.0 on their APU socket before they will on their AM3+ desktop/enthusiast line because of their reluctance to dump sockets on the server side. Part of the selling point for new Opterons is that customers will be able to drop-in upgrade their machines over a period of years. As a result, the AM3+ chipsets, pin layouts and processors are held back by AMD's approach in the server space (AM3+ chips are actually server chips).

It's gotten so bad that when AMD released their GCN 7970 GPU, their slides detailing the performance and benchmarks were done on Intel's X79 2011 platform, because AMD still doesn't support PCIe 3.0. They have done better with SATA3 and USB 3.0, but simply adding more SATA3 and USB 3.0 ports to a PCB isn't going to be the best approach here. I'd much rather see stuff like this coming out of AMD than just adding on to the stuff that's already there. When it comes to chipsets, both Intel and AMD are utterly lazy, but that's the reality and difficulty of dealing with chipsets :/
 
Toss in the cost of a new mobo. Now how's that 1 1/2 to 2 year payoff looking? More like 4-5 years?

Seriously, not a fanboi here: I've got 4 machines running and they're split 50/50 AMD/Intel. I like the stability of the AMD AM3+ socket over the last few years.

Happy me.

Continue counting pennies and tell me why I should be sad. ;)
 
Well, here's a nice video on power usage. Skip to the 9:00 mark.

http://www.youtube.com/watch?v=4et7kDGSRfc

Wasn't he the guy who a few weeks ago did random gaming tests with an overclocked AMD CPU vs a non-overclocked Intel CPU and then claimed the AMD was faster?

And again though, the video shows that the cost savings from the AMD processor are a wash due to the power draw, even more so if you overclock, as is custom for people on this forum. So if the Intel overclocks better and performs better for most people's use, what is the point of the AMD? You're not saving any money. You're just buying an inferior product... unless of course you are doing large-scale video editing or something the AMD excels at.
 
Wasn't he the guy who a few weeks ago did random gaming tests with an overclocked AMD CPU vs a non-overclocked Intel CPU and then claimed the AMD was faster?

And again though, the video shows that the cost savings from the AMD processor are a wash due to the power draw, even more so if you overclock, as is custom for people on this forum. So if the Intel overclocks better and performs better for most people's use, what is the point of the AMD? You're not saving any money. You're just buying an inferior product... unless of course you are doing large-scale video editing or something the AMD excels at.

Looks like you got lost on the way to another part of the forum.
 
Wasn't he the guy who a few weeks ago did random gaming tests with an overclocked AMD CPU vs a non-overclocked Intel CPU and then claimed the AMD was faster?

And again though, the video shows that the cost savings from the AMD processor are a wash due to the power draw, even more so if you overclock, as is custom for people on this forum. So if the Intel overclocks better and performs better for most people's use, what is the point of the AMD? You're not saving any money. You're just buying an inferior product... unless of course you are doing large-scale video editing or something the AMD excels at.

It is the same guy, but I watched it and he didn't say any CPU was overclocked.

http://teksyndicate.com/videos/amd-...s-3820-gaming-and-xsplit-streaming-benchmarks
 
. <- topic - - - - - - - - we are here -> .

lol, all it takes is one hurt butt

The backward compatibility is pretty nice to see for people that want to upgrade; for me, unfortunately, it doesn't go the other way (AM3 taking Piledriver CPUs). I've demoted my X3 720 machine to ESXi test lab (formerly my gaming machine, replaced by the 2600K in my sig). I *want* to upgrade it, but the farthest I'm going is Thuban I guess =/
 
Wow I've yet to see one of these threads that doesn't become some sort of fanboi flame war. This is now my second Intel build after 10 years or so of AMD. Both sides have their advantages. If I was on single monitor I would still be using an AMD processor.

Us PC gamers need to stick together regardless. The console horde is on the horizon, looking to eat us alive with their ridiculous notions of how "hard" it is to drive a 60" TV when all it is is 1080p, and how "leet" they are when they don't even realize where "leet" f'n came from.

Build PCs, game on them, enjoy them, band together and support each other.
 
^This^

.... and I will add that I consider the interoperability of various processors from AM2+ to AM3 one of my favorite aspects of the company's lineups. I could be wrong, but I recall Intel forcing several board changes in the same time frame if you were staying "current". It seems less so now, but back then AMD boards had more features and better tweaking options at the same or lower price points.

Presently, I just buy AMD out of habit, and the assurance that AM3+ will last me 2 or more years.
 
It is the same guy, but I watched it and he didn't say any CPU was overclocked.

http://teksyndicate.com/videos/amd-...s-3820-gaming-and-xsplit-streaming-benchmarks

It has been debunked all over the internet. The chinstrap tool is a shill.
http://www.overclock.net/t/1348126/fx-8350-better-than-intel/40#post_19023333
Okay so there are a lot of things that are off in the video:

- Using a 7870 to compare gaming performance on high-end CPUs
- The 3820 clearly loses to the 3770K in Crysis 2 despite being clocked higher (1080p)
- He doesn't state if he's using any built-in function to benchmark the games or just playing for a set amount of time
- Crysis Warhead results are all over the place even though the game is 2-threaded; the i5 and the 3820 had horrible fps, and the difference can't be that big
- For Warhead streaming he says one number for the i5, but on the screen there's a completely different number
- The Warhead streaming tests didn't include the 3770K at all
- Everything, including ridiculous levels of AA, cranked up to the max on a 7870
- The 3770K losing in Black Mesa makes no sense since the 3820 does just fine
- No info on the test sequence for Metro
- Trine 2 numbers are just plain off; it's not a hard game to run, even for the weird graphics card choice
- Different memory for different setups, and it was slower for Intel
- He talks about overclocking the FX-8350 to nearly 5 GHz but says nothing about overclocking the Intel parts
- Incomplete info about the test setups

In essence he's saying that with a 7870, games like Metro 2033 will get double the fps with an FX, which is nuts.
 
Well it seems that he made some mistakes reading numbers in his video.
There's nothing wrong with using a 7870, as it is recommended even by [H] as a decent video card and it's something that the average consumer can afford. Now I guess that you could argue that AMD is slipping some secret sauce into their cards to run better on AMD CPUs.
I was always told that one of the features of Intel systems was that faster memory really doesn't give you better performance. But I wish he had done the testing with equal memory to keep the hounds away.
I don't play Trine 2 so no idea there.
He explained that all systems were running stock speeds. Read the website.
No idea what's up with the 3820 as I haven't run Intel since my Q6600 1.5 years ago.

I think that people get upset if their perception of reality isn't what is shown to be true. I can tell you that streaming a game, compared to just playing it, isn't that big of a difference. I stream at 720p as some of my buddies don't have the internet speed for 1080p. So my streaming software is converting 1080p to 720p on the fly at all times (no hard drive backups). Plus I'm playing the game in 1080p naturally.

The difference in fps with all settings on max in games is 5 - 10 fps. When my Intel 2700K buddy started streaming he had to drop his AA and other things to bring his frame rate back up to what he deemed acceptable. The exceptions to this are BF3, where I had to drop my AA from Ultra to whatever is the next step down, and Project CARS when I enabled rain, but that game is in the alpha stage of development.

Now my buddy with the 2700K recently moved to the sticks in VA, so unfortunately I can't compare streaming numbers as the internet out there is crap. But from what testing we did, I could tell that the main difference between our systems when streaming was that he had to lower his settings more than I did. In the end it didn't matter, as both of our streams looked great and were flawless. So we called it a tie. We tested this over a 3-day period and sank about 8 hours into trying different things to eke a few more fps out of our systems.

If I were to start streaming and needed a system, I'd spring for the 3930K, as its multithreaded capabilities far exceed the 3770K and FX-8350, as explained here. Skip to 7:50.
 
I know Intel is better, but I can't ever justify their motherboard upgrade process. Look at 775 and how many mobo chipsets there were, or 478. I will keep buying AMD due to the support of Socket A, AM2, AM3+. I've had 7 (the first Socket A ever, 754, AM2+, AM3+) and will keep buying as long as I can.

I might get their next AMD APU for a mATX/mITX build with a video card.
 
Now I guess that you could argue that AMD is slipping some secret sauce into their cards to run better on AMD CPUs.

Unfortunately for AMD, they aren't. There's a reason why AMD's GPU division uses Intel CPUs to benchmark their cards.

That 8350/3770K video is worthless... I don't know of any reputable source that has ever backed up those results.
 
I have 2 different AM3 boards by 2 different manufacturers that support AM3+ processors, officially, but only the BD x1xx parts (4100/6100/8120).

In theory, you can use an AM3+ processor in an AM3 board, and an AM3 processor in an AM2/AM2+ board, but in reality the board manufacturers stop BIOS updates after 6 months to a year when the NEW SHINY MODEL comes out, and you are poked.

Call me disillusioned, but my next build is going to be Intel, and it has nothing to do with speed...
 
Wow I've yet to see one of these threads that doesn't become some sort of fanboi flame war. This is now my second Intel build after 10 years or so of AMD. Both sides have their advantages. If I was on single monitor I would still be using an AMD processor.

Us PC gamers need to stick together regardless. The console horde is on the horizon, looking to eat us alive with their ridiculous notions of how "hard" it is to drive a 60" TV when all it is is 1080p, and how "leet" they are when they don't even realize where "leet" f'n came from.

Build PCs, game on them, enjoy them, band together and support each other.

Except it's a little different perspective from PC/console gamers. I game on the PC and the Xbox 360... I can tell you from my experience having both that I wouldn't want to give up either. You simply can't match the PC in graphics and in-game playability. Sorry, but analog sticks don't compare to a mouse/keyboard. On the flip side... I have a buddy that games with me, and the amount of time we've dumped into getting a PC game to work online and hook up with each other has been ridiculous. I remember a few two- to four-hour nights just trying to get something to work. Whereas if we wanted, we could have just hopped on the Xbox, connected to Xbox Live and joined up to play. Three minutes tops if we were slow.

Generally I've noticed with games made for both PC and console that the PC versions are a bit more thorough or developed. Generally a little prettier, and if your machine is up to spec they typically "play" a little better. On the flip side of that, there are far more console games made just for the console than for both or just the PC. So for some, you have no choice but to play on the console.

Oh, not to mention the difference in cost. My 7970, on sale at Amazon plus a rebate, cost as much after the discount as an Xbox 360.
 