OP is BSing this whole thing. He has not replied, scared to come back?
Businesses have learned that, in the end, they make more money with this policy than if they let angry buyers go off and trash the brand for weeks. Buyers like this will lie just to damage the brand if they get mad. It's just the way it is.
DO NOT DO THIS, UNDER ANY CIRCUMSTANCE.
Intel is already getting fucked unnecessarily by shipping a 2nd chip, so why cost them more? Companies always pass the burden of extra costs onto consumers. You may think doing this once won't affect them, but it's this type of mentality, spreading across the entire customer base, that costs companies money.
Besides, there's a poor guy in receiving who will have to double-check all those serial numbers, scratching his head, stressing out, and losing time as he unfucks everything you did. As someone who's been that guy, it's fucking annoying.
Now I know why our prices are so damn high for these things.
Well, to be fair, this is one of those cases where the people at Intel were probably like, "What? This idiot... really? Just send him a new one to make him STFU." Then they chalk it up as a loss. Every company does this: sometimes you just have an anal customer, and you lose a little money to get them out of your hair. But at the end of the day, these types of customers typically represent a very small portion of your client base, and as you can see they are very vocal, so you find it's better to just make them happy and end it.
... The i7 line is basically "the Xeon reject line" and therefore Intel has many of them that they're trying to get rid of (especially this chip being at the very bottom of the SKU stack). ...
This thread is funny but this quote is even funnier... The i7 targets a slightly different customer than the Xeon line. Xeons are targeted at blade servers and the i7 at workstations, really. Xeons are for data that has to be perfect, or as close as possible, with 100 percent uptime, whereas the i7 is supposed to squeeze every ounce of performance out of a set task. They also have registers the Xeons do not, and vice versa, which is the biggest reason you can tell i7s are not binned Xeons. The way a processor is "grown" is that a layer of nonconductive material is laid down, then conductive metals are laid down and essentially grow as crystalline structures into the shapes the layout needs. They use lasers to cut and sculpt these at every stage of the process, but mistakes happen, and at the scale they are working at (nanometers) there is no way to fix the shapes once they harden in place. So they can cut certain sections out, which results in chips with certain parts disabled, and some parts just turned off in firmware. Second, sometimes the laser's cut ends up too long or too wide.
Wow. Douchebaggery and entitlement in a single thread?!
And apparently envy is here as well...
Cerulean said:
Can someone provide pix of AMD RMAs? I would like to be well informed before I make my CPU purchase.
You had Intel replace a perfectly fine CPU with another one because the box has damage?

It is possible Intel told him not to use the chip in the bent box and to just send it back. But I agree: normally this shouldn't have been an issue, and it just ends up costing them, and in turn us, more money.
Well, kudos to Intel!
I sent them pictures of how the first box came and they overnight shipped me a replacement. All I gotta do is send the first one back. They were very quick to contact me. Great service from them.
Intel doesn't have an i7 production line and a Xeon production line. LCC Xeons and i7s come off the exact same production line, with the differentiating features activated or deactivated depending on what a chip is to become. The best (lowest-leakage and highest-speed) chips off a wafer become Xeons to satisfy Xeon demand. The lesser chips become i7s, along with possibly some Xeon-grade chips that weren't needed as Xeons due to insufficient demand at the time. The MCC (medium core count) and HCC (high core count) chips are made with great care due to their high core count and the high penalty for defects. All MCC and HCC chips become either E7 or E5 Xeons.
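The harvesting logic described above can be sketched as a toy binning function. This is purely illustrative: the thresholds, the demand check, and the SKU names are made-up assumptions, not Intel's actual criteria or process.

```python
# Toy sketch of die harvesting/binning: best silicon fills Xeon demand,
# surplus and lesser-but-functional dies are sold as i7s.
# All thresholds and labels here are invented for illustration.

def bin_die(leakage_ma: float, max_clock_ghz: float, xeon_demand_left: int) -> str:
    """Assign a tested LCC die to a product bin."""
    if leakage_ma < 50 and max_clock_ghz >= 3.5:
        # Xeon-grade silicon: becomes a Xeon while demand remains,
        # otherwise it is sold as an i7 anyway.
        return "Xeon" if xeon_demand_left > 0 else "i7"
    if max_clock_ghz >= 3.0:
        # Lesser but functional silicon becomes an i7.
        return "i7"
    # Anything below spec is scrapped (or further fused down, in reality).
    return "scrap"

print(bin_die(40, 3.8, 100))  # -> Xeon
print(bin_die(40, 3.8, 0))    # -> i7 (Xeon-grade, but no demand left)
print(bin_die(80, 3.2, 100))  # -> i7
```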
And apparently envy is here as well...
Nahh. Pretty happy with my setup. The only reason I want a CPU upgrade right now is the horrible performance of Diablo 3: it's effectively single-threaded, and instead of particle physics being offloaded to the GPU, it's done in software. With my setup at 1440p and everything maxed out in game, I drop down into the 20s. If I turn on V-Sync in game, which allows Crossfire to work, I drop into the 30s. It's ridiculous.
the rest of you need to stop your whining, Intel did the right thing, it's their responsibility to make sure a whole package makes it to the customer from an RMA, and if it arrives banged up, then they need to file a claim with UPS and give him another CPU, which they did. Good for them, and glad the OP got a replacement.

Had this been for an actual malfunctioning CPU, then people would be more inclined to agree with you and more sympathetic to OP's situation.
I would have asked for a replacement too; makes it easier to resell...
the rest of you need to stop your whining, Intel did the right thing, it's their responsibility to make sure a whole package makes it to the customer from an RMA, and if it arrives banged up, then they need to file a claim with UPS and give him another CPU, which they did. Good for them, and glad the OP got a replacement.
D3 is multithreaded. After the first few patches, it was much better threaded than it was at release.
Is your sig system the one you are using to play it with?
Have you tried manually setting the affinity of D3 to only run on full cores?
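On Windows you can pin a process to specific logical CPUs with `start /affinity <hexmask>`. A quick way to build a mask covering only one logical CPU per physical core, assuming the common Intel layout where SMT siblings are numbered pairwise (0/1, 2/3, ...) so the even-numbered logical CPUs are the "full" cores; check your own topology before relying on this:

```python
# Build a CPU-affinity bitmask selecting one logical CPU per physical core.
# Assumes hyperthread siblings are adjacent (0/1, 2/3, ...), which is the
# usual layout on Intel under Windows -- verify on your own system.

def full_core_mask(physical_cores: int) -> int:
    """Bitmask with a 1 for every even-numbered logical CPU."""
    mask = 0
    for core in range(physical_cores):
        mask |= 1 << (2 * core)
    return mask

mask = full_core_mask(4)  # 4-core / 8-thread CPU
print(hex(mask))          # 0x55 -> logical CPUs 0, 2, 4, 6
```

Then, from cmd.exe (game path here is hypothetical): `start /affinity 55 "Diablo III.exe"`.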
The memory controller is also pretty weak compared to Intel's. That may have something to do with it, although it really should run just fine; when the game first came out, a family member and I played it quite a bit.
At that time, he was running a low-end Core 2 based chip overclocked to ~3.5GHz with a 4850 1GB card. Your setup should eat that system alive. Pretty sure he was running pretty high settings at 1080p.
I was running dual 6870s at the time and had everything maxed out at 1080p. After the first few patches, it was way, way better than right at release. Never had any slowdowns with almost everything maxed out at 1080p. My CPU was an Intel 3820 running at 4.8GHz, if I remember right.
I am guessing I had shadows turned to medium as I really don't like super sharp shadows. Everything else would have been set at max.
Have you tried turning shadows down?
Is your CPU running at stock speed or is it overclocked?
Is your RAM actually running at 1866 or is it running at some default speed (1333 maybe)?
The warranty entitles the end user to a working replacement CPU. Not a new CPU in a Gem Mint 10 condition box. Most companies send out brown box refurbs for warranty replacements.
AMD's and NVidia's graphics ASICs come off the same production line. Are they also just enabled/disabled and binned chips, too?