AMD 8700B benchmarks are terrible; this CPU is crap.

In all fairness, the laptop in question (EliteBook 745 G3) is nearly as big as a Dell XPS 9550, which has a 35W TDP CPU and a 60W TDP GPU. So the "it's a 15W CPU" argument doesn't hold much water. If I'd bought this laptop I'd return it. I know that's probably not an option because it's a company machine, but the performance is pretty bad.
 
What does size have to do with it? And for cripes sake, you're talking about a laptop that's twice the price and has better cooling plus a dedicated GPU. Come on man....wow

Seriously, my head hurts from some of these posts.
 
I'll add that at home I have a Dell XPS 13 with a 15W 4300U.
The Dell costs as much as the EliteBook and that CPU is more than twice as fast on every single test I have done.
More than twice as fast.

So where are the H.265 tests with both machines? :)
If AMD is so crap, why worry about doing those?
 
I wonder how the OP knows how much the CPU costs HP relative to an Intel processor? I'm willing to bet the 32-bit dual-core Atom that would cost HP the same wouldn't exactly shine here, either....

Probably some truth there. I bet Intel's profit margins are a lot higher with their 14nm chips than AMD's at 28nm. I really want to support and defend AMD, but it seems like the OEMs build complete crap around the AMD chip; I've read multiple reports where the A10-8700/8800 throttles badly. I bought an HP DV7 17" for $400 that looked a lot like a MacBook Pro ripoff with the first AMD Llano chip, and it was fantastic. I overclocked it from 1.6GHz to 2.3GHz and it outperformed a Q6600 in Cinebench, plus it played World of Warcraft pretty well. It seems like since that Llano chip there has been little to no progress.
 
Just to give you guys some more background....last Xmas I purchased a new open-box HP Envy 15 off of eBay for a great price that has the FX-8800P processor in it with a hybrid drive and 6GB of RAM. I upgraded the memory to 16GB of G.Skill Ripjaw series (2x8) and slapped a Samsung 850 EVO in it. The chip was locked at 15 watts but went up to 25 watts for the first 120 seconds before settling in around 15-18 watts during extended gameplay. I was able to play Diablo 3 on the highest settings with a gimped APU. Just think if they gave you the whole 35 watts or even marginally better cooling/hardware. OEMs are crippling APUs on purpose for more/cheaper sales and calling them gaming laptops. No, it still wouldn't have been able to play the latest games on high at 35 watts, but it was much faster than anything Intel could give you in that price range and TDP without a standalone GPU.
 
That is something I expected from it, and it is good to hear you can go there, because Diablo 3 can be very picky on performance and the best I could do on my Kabini-based laptop was low (even the resolution had to come down to 1024x768) without it becoming a slideshow. Was it the RAM that pushed performance?
If only those companies had a bit better understanding of how to build better laptops with these kinds of chips. I can't imagine that the total cost for any laptop with an FX-8800P would skyrocket when you allow closer to 30-watt performance.
Then again, if you look at the desktop that Dell sold with the FX-8800P, you stop asking questions altogether. I'll say it again: those people are screwing themselves with the approach they take on AMD laptops. And they're still complaining about the PC market being in a slump...
 
Is it because they want the Ultrabook certification? Battery life and slimness?
 
Looking at the topic starter, I'm sure there is a fetish that it needs to run Cinebench, because that is what you do with your laptop/netbook.

I'm sure the industry has already noticed that a meaningless word like "ultrabook" doesn't mean anything to the consumer. Maybe they can add a few other prefixes like MEGA or GIGA; maybe that will work. Mega Giga Ultrabook...
 
Every little thing you can upgrade in a laptop pushes performance just that much extra. Most people don't realize that 15-watt CPUs were stuck with at most 1600MHz RAM due to the limited power, until recently with Skylake. Plus, OEMs gimped laptops by adding one 8GB stick of RAM instead of 2x4GB for dual channel. 15 watts average and stuck with one stick of 1600MHz CAS11 RAM, and you can see how gimped they made those abominations. I added 2x8GB of CAS9 RAM and an SSD and it's like a brand-new laptop. If they could have just added a little better cooling, that would allow the APUs to stay at a higher TDP (wattage) without changing anything else. I'm no expert, but I did a lot of research before picking one up, as the FX-8800P looked really good at the time for a light gaming laptop. I wasn't going to pick one up after the research but found a screaming deal on eBay to play around with.
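The single-channel vs dual-channel difference really is measurable, even from a quick script. Here's a rough stdlib-only sketch that times large buffer copies to estimate memory bandwidth. The buffer size and the read+write accounting are my assumptions, and Python overhead means it understates what the hardware can do, but the gap between one stick and two still shows up in numbers like these.

```python
# Rough single-threaded memory bandwidth probe (stdlib only).
# Run it on each memory config (e.g. before/after adding the
# second stick) and compare the reported MB/s.
import time

def copy_bandwidth_mb_s(size_mb=64, repeats=5):
    """Time large buffer copies and report the best MB/s observed."""
    buf = bytearray(size_mb * 1024 * 1024)
    best = 0.0
    for _ in range(repeats):
        start = time.perf_counter()
        _ = bytes(buf)  # one full read + write pass over the buffer
        elapsed = time.perf_counter() - start
        # Each pass moves size_mb MB of reads plus size_mb MB of writes.
        best = max(best, (2 * size_mb) / elapsed)
    return best

if __name__ == "__main__":
    print(f"~{copy_bandwidth_mb_s():.0f} MB/s (single-threaded copy)")
```

It's nowhere near a proper tool like AIDA64's memory benchmark, but it's enough to see whether dual channel is actually active.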
 
I see what you're saying, but stop into Best Buy -- it's clear one trend is hot right now, and that's "thin". 15 watts is easier to cool than 35 watts. I regularly hear people complain that their new shiny laptop does not have an optical drive; it was sacrificed for size and weight. What I don't understand is why the 17" notebooks don't get the higher-wattage chips. I had a Gateway FX back in the day, swapped an Intel Extreme processor into it, and ran it at 3.4GHz. It was a 25-watt P8600 I replaced with a 65-watt chip and it ran great, though it did throttle sometimes without a cooling pad under it.
 
I'd say thin/light + big battery is the best thing to ever happen to laptops. Stuff like the Dell XPS with zero bezel is just icing on the cake -- they look so damn good.
Now if only someone made a power-efficient chip with a half-decent GPU that can play Dota at 1080p60 -- it'd be nothing short of a miracle.
 
I can see how the lower-end AMD CPU performed as poorly as it did in Cinebench:

Cinebench makes heavy use of the SSE 4.x instruction set, which AMD CPUs (yes, even current power-hungry desktop CPUs) have historically been very poor at handling compared to Intel's low-end CPUs.
 
I still don't see how the Cinebench score is so bad. How did these guys get 194 with the same APU?
AMD A-Series A10-8700P (Carrizo) Notebook Processor Specifications and Benchmarks
Still not a great score, I get it, but the OP got only 112? That's just over half what someone else got; not within margin of error at all.
How much bloatware/other software was running in the background?
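For what it's worth, one quick way to check that last point before benchmarking is the load average. A hypothetical helper like this (Unix-only, via `os.getloadavg`; the 0.5 threshold is my own arbitrary cutoff) flags a machine that's already busy in the background:

```python
# Minimal sanity check before running Cinebench-style benchmarks:
# if the machine is already busy, scores like 112 vs 194 are
# apples-to-oranges.
import os

def system_is_idle(threshold=0.5):
    """Return True if the 1-minute load average per core is below
    threshold. Uses os.getloadavg(), available on Unix-like systems."""
    load1, _, _ = os.getloadavg()
    cores = os.cpu_count() or 1
    return (load1 / cores) < threshold

if __name__ == "__main__":
    if not system_is_idle():
        print("Warning: background load detected; scores will be skewed.")
```

On Windows the equivalent check is just a glance at Task Manager's CPU tab before hitting Run.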

The point has been moot for god knows how long, yet these muppets on the forums keep insisting on running single-threaded benchmarks on an architecture that struggles with them. Might as well run x87 code and bitch about that as well..

When you point them to an interesting benchmark or a real-life use, like HEVC-based movies or clips, which run really well on Carrizo, you hear crickets.
 
I was referring to the sub-200 CPU score in Cinebench 15. For comparison, my i5-4210U laptop achieves a CPU score of 236 in Cinebench, while my mini-ITX i3-6100 breadbox achieved a CPU score of 401 there.

On the other hand, sblantipodi, were those benchmarks run with that laptop on battery power? If so, that's the cause: when a laptop is running on battery power, the CPU clock speed gets throttled back no matter what. The benchmarks should be run with the laptop plugged into the AC mains via its power brick, with all power-saving features disabled, for the most accurate results. If, on the other hand, those particularly bad results came while plugged in, then there were too many processes running in the background.
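On a Linux box you can sanity-check the plugged-in part programmatically. This sketch reads the AC adapter state from sysfs; the `/sys/class/power_supply` paths are an assumption about a typical Linux laptop, and it returns `None` when no adapter entry exists (desktop, VM, or another OS):

```python
# Hedged sketch: detect whether a Linux laptop is on AC power before
# benchmarking, since firmware/OS power plans usually throttle the
# CPU on battery.
from pathlib import Path

def on_ac_power():
    """Return True/False if an AC adapter state is readable, else None."""
    for supply in Path("/sys/class/power_supply").glob("*"):
        type_file = supply / "type"
        online_file = supply / "online"
        if type_file.exists() and type_file.read_text().strip() == "Mains":
            if online_file.exists():
                return online_file.read_text().strip() == "1"
    return None  # no adapter entry found (desktop, VM, or non-Linux)

if __name__ == "__main__":
    print("On AC power:", on_ac_power())
```

On Windows the same check is the battery icon in the tray, but a scripted check like this is handy when collecting results from several machines.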
 
Good point! Was the unit decrapified and optimized first? Anybody that has used any OEM laptop knows they come preloaded with a ton of shit. I bet it has a trial of Norton or McAfee running on it...
 
So I don't follow benchmarks much at all, because they're pretty much pointless at showing anything other than a number. What exactly keeps Cinebench unbiased? I mean, we can't keep congressmen and senators from being paid off; what keeps Cinebench from getting a few bucks from some processor manufacturers? Do they have open books? How do they make money?

I'm not accusing them of anything, but why would I trust some little company to basically grade 3 or more huge companies with unbiased results?

All I know is that my i5 2500 Sandy Bridge, which is supposed to be ridiculously faster than anything AMD offered at the time, is on average about 10% slower than my Phenom II... Granted, the Phenom II is 125W and I think the i5 is 95W, but still... I paid $150 for the Phenom II new 6 years ago... I paid $120 for the i5 used (chip only) on eBay... last year...

Tell me about bangs and bucks again...
 
It amazes me (it really shouldn't) that people have nothing better to do than go around on the forums and shit all over AMD. AMD is a smaller company with a smaller budget; they do what they can with what they have.

If people really want AMD gone then have fun paying some ungodly cost for Intel CPUs when there's no competition.
 
There's no competition whatsoever as of today; AMD going out will make no difference in competition.
And they were always a limited-resources company which made good products that competed in the market before the current gen. People bitch because it's a waste of someone's money now, under the false illusion of price/performance.
 
Size vs. performance is the basic metric against which all notebook performance is measured. You trade performance for size, and at the size level of this notebook, you can do a hell of a lot better with a similar-size system. The HP EliteBook 745 G3 has an MSRP around $1000, as does the XPS 15. It's not a cheap laptop. Maybe your head would hurt less if you had any idea what we were talking about.
 
So where are those HEVC benchmarks Carrizo falls short on, compared to current-day Intel offerings? Oh, sorry, last-gen Intel offerings. Keep it fair, right?
 
LOL....you can do better performance at any size of laptop. Maybe you should try reading all of my posts in this thread before throwing around stupid statements. Size doesn't mean crap anymore. I have a 12" Surface Pro 4 that could run circles around most 14" laptops...big deal. Congratulations, you're giving me another headache. :shifty:
 
Never seen any, and even if that thing is 10x faster at 4K HEVC video decode, it's hardly a selling point.

If it was good, it would've made money for the company; it's always that simple.
 
And no, Intel just does not do a very good job at it, or not at all.

Carrizo is the CPU that has hardware HEVC support a generation ahead of its Intel counterpart. And you write that AMD cannot compete unless it is compared against something from the previous generation.

So prove your point: show the previous-generation CPU from Intel which is the Carrizo counterpart and run those useless benchmarks on x265 decode.
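For anyone who actually wants to settle that decode test, ffmpeg's `-benchmark` flag with the null muxer times pure decode without writing any output. Here's a small sketch that builds the command; `sample_hevc.mp4` is a placeholder for whatever HEVC clip you test with, and it assumes ffmpeg is installed:

```python
# Build an ffmpeg command that decodes a clip and discards the output,
# so only decode speed is measured. "sample_hevc.mp4" is a placeholder
# input file; ffmpeg itself is assumed to be on PATH.
def hevc_decode_cmd(path):
    """Return the argv list for a decode-only benchmark run."""
    return ["ffmpeg", "-benchmark", "-i", path, "-f", "null", "-"]

if __name__ == "__main__":
    # Print the command so it can be pasted into a shell or passed
    # to subprocess.run().
    print(" ".join(hevc_decode_cmd("sample_hevc.mp4")))
```

Run it on both the Carrizo and the Intel box and compare the timings ffmpeg reports at the end; working hardware decode should also show up as near-idle CPU during playback.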
 

I think up until Haswell, Intel doesn't have that feature at all; I don't know about the current gen. Adding a 4K hardware decoder doesn't make the chip faster or more efficient, though, for a laptop which will be a budget offering with a 1080p screen in the best-case scenario....

Anyway, it's a big win if we get budget 4K laptops.

On the other hand, better single (or multi) core perf/watt in an ultrabook, at the cost of a shittier iGPU, makes sense, as apps work faster, Outlook loads faster, etc.
 
Why would anyone do 3D rendering on a 15W CPU :LOL:? Anyway, if the laptop does what it is meant to do at a lower cost -- duh -- that is the one to get. I guess one could make a thread comparing a game benchmark where this HP would smoke a similar Intel one, then conclude Intel is junk! Or video encoding instead, with the same rant. In other words, benchmarking a device on something it will never do seems rather pointless.
 
This CPU is only good for web browsing, so why would someone spend a thousand dollars for web browsing?
 
Neither here nor there, but I've had a few AMD CPUs. The only good one was the Athlon Barton 2500+. After that, all Intel.
 
Run it with HWiNFO64 and graph the CPU frequency while it runs... I am curious what the frequency looks like.

Also, you might set the thing to high performance mode while running these benchmarks (seems obvious, but was it set to that?)
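If you don't have HWiNFO64 handy, here's a rough Linux-side version of the frequency-graphing idea: sample the current clock from sysfs while the benchmark runs, so throttling shows up as a sagging line. The sysfs path is standard on Linux cpufreq systems but won't exist elsewhere, in which case the function just returns an empty list:

```python
# Sample cpu0's current clock from sysfs while a benchmark runs in
# another window. Throttling appears as the MHz values sagging over
# time. Linux-only; returns [] where the cpufreq path is absent.
import time
from pathlib import Path

FREQ = Path("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq")

def sample_freq_mhz(seconds=10, interval=0.5):
    """Return a list of (elapsed_s, MHz) samples, empty if unreadable."""
    samples = []
    if not FREQ.exists():
        return samples
    start = time.monotonic()
    while time.monotonic() - start < seconds:
        khz = int(FREQ.read_text())  # sysfs reports kHz
        samples.append((time.monotonic() - start, khz / 1000))
        time.sleep(interval)
    return samples

if __name__ == "__main__":
    for t, mhz in sample_freq_mhz():
        print(f"{t:6.1f}s  {mhz:.0f} MHz")
```

Feed the samples into any plotting tool; a 15W Carrizo that boosts briefly and then drops well below base clock would explain a 112 Cinebench score.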

At the opposite end from you... I have a problem with FryBench in that it is bugged and only recognizes 32 cores :( So it's throwing away 16 cores' worth on my rig.

Cinebench runs fine for AMD, at least on the K10s... lately I've been running 16 cores (out of 48) at 3.5/3.6 GHz... scores 15.8-15.9...

It's just crap with BD-architecture AMD CPUs. Excavator finally got back up to K10 levels (4 Excavator cores are now about equal to 4 K10s at the same frequency).


My company computer has an AMD Carrizo A10-8700B CPU.
This CPU is complete crap.

It scores 112 points in Cinebench and it took 23 minutes to finish the first run of FryRender.
An old Haswell-U has twice the performance of this crap.

How can AMD sell this CPU at this price?

Talking about an HP EliteBook, 1920x1080 14-inch display, 512GB M.2 SSD, 16GB RAM.

fryrender.png


cinebench.png


This thing is a smartphone; for the same price you can get a CPU from Intel that has double the performance.
 
Two months ago I went from a 4.4GHz 3770K to a Vishera 8350 (gave the 3770K to a family member). I also run a watercooled, overclocked 290 @ 2560x1440. The typical games I play are GW2, WoW, Overwatch, Rainbow Six Siege, and Payday 2.

The only game I notice a big difference in fps is GW2.

I noticed my web browsing experience is better with the AMD vs the Intel. I am not sure this is a pro for AMD but more a con for HT.

I would have bought a 5820K to replace my 3770K, but I wanted to see for myself if all the negativity was true about AMD. I feel it is true in certain circumstances, but I also feel it is blown out of proportion with benchmarks. WTF plays at 1024x768? Does it really matter once you are above 120fps in that situation?

This 8350 is a stopgap till the new chips show up. If Zen is good I will move to Zen; if Zen is not good I will grab an Intel hexacore.
 
I would really not believe that going from Intel to AMD in WoW (which heavily favors a single fast core over multicore) didn't have a humongous dip in minimum fps (like in Ashran, for example). Or do you play with less than maximum settings + AA, and it is always locked at 60 anyway?

Ashran/Warspear camp should be stuttery, into the 20-40s, with a slow core, based on my assumptions.
 
I didn't say it wasn't lower; I said it was not noticeable.

30-50fps in that area on the Intel. Not a noticeable difference.
 
Wow, I didn't think it'd be that low on an i7. I will have to renew my subscription and test it on my 4.7GHz 6400 and see if there's any improvement; looks like it is just really bad optimization. Last I remember, it was the same 30-40 dips for me on an i3, while the rest of the game never went below 60, ever.
 
The game only uses 3 cores. If I recall, the breakout is interface/graphics/audio, with audio easily able to double up on another core using almost nothing. Whatever you saw on the Core i3 should be the same on any other CPU.
 