2nd Tier Makers Unlikely To Ship Fermi Cards Until April

28nm Fermi - I haven't laughed that hard in a while. We may see a 28nm ATI GPU later this year, but there's no way NVIDIA will shrink that die by the end of the year. I still believe they'll be lucky to get 40nm Fermi to market in ample quantities by Q3.

You still have to admit, if anything needs a 28nm die-shrink, it's Fermi. High costs, power consumption, heat, and even clock speeds are all things it appears to be struggling with at 40nm. Even though it most likely won't arrive any time in 2010, it would be in NVIDIA's best interest to move as quickly as possible and release a 28nm Fermi in 2011. Whether NVIDIA can actually pull off shrinking Fermi to 28nm is a whole different question.
 
Sure, Fermi needs a die shrink, but apparently 28nm will be even harder to work the kinks out of, so if they can't get it right at 40nm, what makes anyone think they can pull off 28nm?
 
NVIDIA pretty much hit the worst-case scenario on their initial 40nm yields.

Can they really do any worse with 28nm? What's next, 0% 28nm yields and NVIDIA scrapping Fermi completely? Do you really think NVIDIA engineers are so incompetent that such a thing is possible? As I see it, once you've hit rock bottom like NVIDIA has, you can only improve from there. Even if they once again hit the worst-case scenario on their initial 28nm yields, they would likely still be in better shape than they were with initial 40nm yields because of higher density per wafer and more usable parts (rough sketch below). Not to mention 28nm has the potential to improve on many of Fermi's downsides.
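
Just to illustrate why that's plausible, here's a rough Python sketch of candidate dies per wafer plus a simple Poisson yield model. Every number in it (die sizes, wafer size, defect density) is a made-up placeholder for the sake of the argument, not anything from TSMC or NVIDIA:

# Rough illustration of "higher density per wafer and more usable parts".
# All inputs are illustrative placeholders, not real figures.
import math

def dies_and_yield(die_area_mm2, wafer_diameter_mm=300, defects_per_cm2=0.4):
    # Standard approximation for gross dies per wafer.
    gross = (math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
             - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
    # Simple Poisson yield model: yield = exp(-defect density * die area).
    yield_frac = math.exp(-defects_per_cm2 * die_area_mm2 / 100)
    return int(gross), yield_frac, int(gross * yield_frac)

for label, area in [("big 40nm die", 530), ("~half-size 28nm die", 260)]:
    gross, y, good = dies_and_yield(area)
    print(f"{label}: {gross} candidate dies, ~{y:.0%} yield, ~{good} good dies")

With those placeholder numbers, the smaller die gets you roughly twice the candidate dies per wafer and a much better yield on each one, which is the whole point of the argument above.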

No matter what, NVIDIA has some tough times ahead of them. I want them to succeed and get back on their feet running, but lately they just seem to be having the worst of luck, not to mention making some poor decisions with things like PhysX and re-branding G92 indefinitely. Provided there are no horrible price/performance surprises, I still plan to buy Fermi, as I find it to be a very interesting and innovative part and believe it is a step in the right direction for both GPGPU and gaming. ATI will no doubt try to mimic some of the successful features of Fermi in their next-gen part. And as I mentioned previously, if worse comes to worst with NVIDIA and Fermi, I see ATI's next-gen 6000 series part in my future.
 
Last edited:
...and that price was stupidly insane as well, and the reason I kept my 2 x 8800GTX's for as long as I have. :D

They've more than paid for themselves in all the years I've used them. I've got no complaints.

March is gonna be an interesting month. I want to replace my old cards with a nice new one to go with the rest of an i7 build, but I'm pretty sure I'll be going 5870 this time around unless [H] puts up a review that shows me why a 480 is worth whatever price they end up charging for it. I really prefer NVidia drivers, always have, but price/performance matters a hell of a lot too, and right now, AMD/ATI has the lead big time.

Those 8800GTXs launched at around $600, not so far from the $650 you've stated as insane and yet you bought two. I'm not saying they weren't a good deal, I just don't understand why so many people seem to think $650 is entirely unreasonable.

Although I doubt it, if the GTX 480 is competitive with the HD 5970 I think a similar price tag is fair.

Also, let's not forget that high prices will reduce demand for the card, which is fine for NVIDIA because they're going to be supply limited for a month or more anyway. They might as well make as much money as they can with the supply they have.
 
...I want them to succeed and get back on their feet running,...
I actually almost hope that they continue to do poorly for another few months or even a year. NVIDIA has plenty of money and there's no doubt that they could survive another year without having the best high-end graphics card. I'd just like to see AMD get more market share and gain a little bit more traction with developers - they're still the underdog and I'd love to see that balance out a bit.
...ATI will no doubt try to mimic some of the successful features of Fermi in their next-gen part...
Don't forget that GPUs take several years to design. Hecatoncheires and Northern Islands are already past the point where AMD would do major feature revisions. If Fermi has something that AMD didn't already see coming, I doubt we'll see it for another two years.
 
The pricing is all relative to performance.

If the $499 GTX 470 surpasses a 5870 by a good 30%, then I say it's priced decently.
And the GTX 480 would have to equal a 5970 if it's asking for a $680 price.

Only time and benchmarks will tell if the price is justified or not.

5850 = $299
5870 = $399
5870 performance = 1.1 x 5850 performance

The 5870 offers 10% extra performance for $100, or a (399/299 - 1) 33.44% increase in price.

If a GTX470 needs to be 30% faster for $100 more, or a 25.06% (499/399 - 1) increase in price, before you even BEGIN to call it decent, then you're smoking something. (Going by ATI's own price/performance structure, a GTX470 only needs to offer 7.5% more performance to be priced at $499.)

There's always an extra premium for that last couple of percent of performance/quality.

And no, the GTX480 doesn't have to be equal to 5970 performance at $680. It offers single card reliability vs. AFR & space concerns.
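
If anyone wants to sanity-check that math, here's a quick back-of-the-envelope Python sketch using the numbers quoted above (the $299/$399/$499 prices and the 10% figure come from this thread, nothing official):

# Sanity check of the price/performance argument above.
prices = {"5850": 299, "5870": 399, "GTX470": 499}
perf_gain_5870 = 0.10  # assumed 5870 advantage over the 5850, per the post above

price_step_5870 = prices["5870"] / prices["5850"] - 1   # ~0.3344 -> 33.44%
price_step_470 = prices["GTX470"] / prices["5870"] - 1  # ~0.2506 -> 25.06%

# If extra performance is "worth" what ATI charges for the 5850 -> 5870 step,
# a $499 GTX 470 only needs this much of an edge over the 5870:
required_perf_470 = price_step_470 * (perf_gain_5870 / price_step_5870)

print(f"5850 -> 5870 price increase: {price_step_5870:.2%}")
print(f"5870 -> GTX 470 price increase: {price_step_470:.2%}")
print(f"Required GTX 470 advantage over 5870: {required_perf_470:.2%}")  # ~7.5%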
 
Last edited:
AMD is in an enviable position as far as graphics go.

They've got more than a half-year lead over Nvidia right now with Evergreen, which will only grow since Fermi isn't due to launch for another month. Evergreen itself is a VERY strong line of products, and they've probably made enough at this point that they could meet Fermi with an across-the-board price drop.

Meanwhile they're working on both a product refresh and their next generation. We can expect a refresh just around when Fermi should be widely available and another 6-8 months from then you've got Northern Islands waiting in the wings.

We haven't even heard squat about what comes after Fermi for Nvidia, at least that I can recall. If AMD can really make up some ground on the CPU side soon then they'll be in a very good place overall.
 
The 5830 isn't even $200.00 yet :D. Though it should be...

Sigh. I blame nvidia. Ever since the fx5200 they have really struggled... the g80/92 being the only real highlight in the last 10 years. But even that is too weak now, and once again nvidia will probably resort to taking up 3 PCI slots or something, heh.
 
krupted, you're speaking bollocks. Nvidia struggling, ffs.

And seriously "the g80/92 being the only real highlight in the last 10 years. but even that is too weak now", well derr, its called outdated. Grow a brain.
 
5850 = $299
5870 = $399
5870 performance = 1.1 x 5850 performance

The 5870 offers 10% extra performance for $100, or a (399/299 - 1) 33.44% increase in price.

If a GTX470 needs to be 30% faster for $100 more, or a 25.06% (499/399 - 1) increase in price, before you even BEGIN to call it decent, then you're smoking something. (Going by ATI's own price/performance structure, a GTX470 only needs to offer 7.5% more performance to be priced at $499.)

There's always an extra premium for that last couple of percent of performance/quality.

And no, the GTX480 doesn't have to be equal to 5970 performance at $680. It offers single card reliability vs. AFR & space concerns.

The 5870 is more than 10% faster than the 5850.

Also

http://www.fudzilla.com/content/view/17851/1/

After couple of inquiries we found out that even though you can get estimated performance figures based on what partners are looking at right now, these are still not final as Nvidia is yet to give out a word about the final clocks of the GTX 470. The safest guess is that it will be around 20 to 25 percent faster than the Geforce GTX 285, which puts it somewhere between the HD 5850 and the HD 5870.
 
The 5870 is more than 10% faster than the 5850.

Also

http://www.fudzilla.com/content/view/17851/1/

I KNEW someone was going to say this... Okay, let's say 15% faster. Then the 470 need only be 11.24% faster than the 5870 to justify $499 using AMD's price/performance 5870 strategy.

Also

That's not conclusive evidence. I'm not making an argument that the 470 is going to be this fast or that fast. I'm just replying to some guy who said it needed to be 30% faster than a 5870 for it to justify a certain price.
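
Same back-of-the-envelope math as before, just wrapped in a little Python function so you can plug in 15% instead of 10% (again, these are only the numbers being argued over in this thread, not anything confirmed):

# Performance a new card needs over the pricier card to sit on the same
# price/performance slope as the cheaper -> pricier step.
def required_advantage(price_low, price_high, price_new, perf_gain):
    slope = perf_gain / (price_high / price_low - 1)
    return (price_new / price_high - 1) * slope

print(f"{required_advantage(299, 399, 499, 0.10):.2%}")  # ~7.5% if the 5870 is 10% faster
print(f"{required_advantage(299, 399, 499, 0.15):.2%}")  # ~11.2% if it's 15% faster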
 
I KNEW someone was going to say this... Okay, let's say 15% faster. Then the 470 need only be 11.24% faster than the 5870 to justify $499 using AMD's price/performance 5870 strategy.

Also

That's not conclusive evidence. I'm not making an argument that the 470 is going to be this fast or that fast. I'm just replying to some guy who said it needed to be 30% faster than a 5870 for it to justify a certain price.

I personally think the 470 is going to be faster than the 5870.

However, I don't believe for a second that the 5870 is going to stay at $400 when the 470 comes out.

I also don't believe that the 470 will be much faster. I think in some games the 5870 will trade blows with, and sometimes beat, a 470.

The only reason the 5870 is $400 is that there are no other DX11 cards available. But when there are, ATI will adjust accordingly.

Remember, no matter how you cut it, the 5870 will be cheaper to produce than a 470 or 480. You can fit more Cypress cores on the wafer: it's almost 1B transistors smaller, most likely 200mm2 smaller, uses less RAM, has a narrower memory bus, and will have been on the market for over 6 months by the time the GTX 4x0 cards come out.
 
It's not fake though.

WTF is the matter with you guys? $499 and $679 are the normal price points for Nvidia cards when they first ship...

That doesn't prove it's real. That only proves it's POSSIBLE that Nvidia will pick that price. A rumor isn't true, and an assumption isn't true. Even saying "Well, it's consistent with past prices" doesn't make it true; at most it makes it likely, but even then that doesn't make it true.

One thing to consider if you're going to equate "possible" and "consistent with the past" with "true": remember that pre-order prices are often, if not always, gouged too, at least by smaller fly-by-night stores. I never saw whether XFX or PNY had a pre-order up, and at least XFX doesn't now (I checked yesterday), but the listing I did see with those prices was from some place I'd never even heard of before, Sabre PC. I've seen plenty of stores in the past easily add $50.00 to $100.00 to the retail price on a pre-order a month or so before release, on video cards, motherboards, and other components. So if "Oh, Nvidia priced stuff like this in the past, so it's true!" works, then sorry, "pre-orders are often price gouged, so the price being inflated must be true" trumps that. :p

I do feel that to be competitive they will need to be cheaper. BUT yes, I do know it's POSSIBLE. If it is that high, oh well. If it isn't, and its performance is enough past a 5870 that it's worth getting, I may get one. If not, like I said, oh well. I've never been a fanboy; I just buy what I think is the best within budget when I want or need something.
 
I personally think the 470 is going to be faster than the 5870.

However, I don't believe for a second that the 5870 is going to stay at $400 when the 470 comes out.

I also don't believe that the 470 will be much faster. I think in some games the 5870 will trade blows with, and sometimes beat, a 470.

The only reason the 5870 is $400 is that there are no other DX11 cards available. But when there are, ATI will adjust accordingly.

Remember, no matter how you cut it, the 5870 will be cheaper to produce than a 470 or 480. You can fit more Cypress cores on the wafer: it's almost 1B transistors smaller, most likely 200mm2 smaller, uses less RAM, has a narrower memory bus, and will have been on the market for over 6 months by the time the GTX 4x0 cards come out.

Way to shrug off what I said and go off on a tangent.

ATI has no reason to drop the price of the 5xxx series except to keep the price/performance crown. It all depends on the price that nVidia chooses for the GTX 4xx cards and their respective performance. Right now, ATI holds all the key mainstream price points. Even if nVidia matches them on price/performance but at a much higher absolute price, they won't profit as much due to the huge barrier to entry. If nVidia releases Fermi at a price such that ATI still reigns in price/performance, I'm not sure prices will budge. There's a whole mess of things to think about; I'm not going to go around like an armchair general pretending I know specifics (including the long-term plans of both companies that could affect their present judgment).
 
Way to shrug off what I said and go off on a tangent.

ATI has no reason to drop the price of the 5xxx series except to keep the price/performance crown. It all depends on the price that nVidia chooses for the GTX 4xx cards and their respective performance. Right now, ATI holds all the key mainstream price points. Even if nVidia matches them on price/performance but at a much higher absolute price, they won't profit as much due to the huge barrier to entry. If nVidia releases Fermi at a price such that ATI still reigns in price/performance, I'm not sure prices will budge. There's a whole mess of things to think about; I'm not going to go around like an armchair general pretending I know specifics (including the long-term plans of both companies that could affect their present judgment).

If the 470 is priced near the 5870 and offers better performance, ATI will drop the price. As I said, the 5870 is going to be cheaper to produce, much cheaper. So the reason ATI would drop the price is that it will either stop 470 cards from selling or force Nvidia to drop their price to levels where ATI makes money and Nvidia doesn't.

If the 470 performs worse and costs more, then there is no reason at all for ATI to change the price. Heck, if that $450 price is correct and the 470 is slower than the 5870, ATI can drop the 5870 to $350, release a faster 5890 at $450, and make even more money.
 
If the 470 is priced near the 5870 and offers better performance, ATI will drop the price. As I said, the 5870 is going to be cheaper to produce, much cheaper. So the reason ATI would drop the price is that it will either stop 470 cards from selling or force Nvidia to drop their price to levels where ATI makes money and Nvidia doesn't.

If the 470 performs worse and costs more, then there is no reason at all for ATI to change the price. Heck, if that $450 price is correct and the 470 is slower than the 5870, ATI can drop the 5870 to $350, release a faster 5890 at $450, and make even more money.

What is this? A healthy company doesn't hold grudges. AMD will do what it can to make the most profits in the long-term, but they'll do what they need to survive in the short-term, even if it means collaborating with nVidia.

And the other stuff? I'm not interested in reading/discussing your platitudes anymore.
 
In all honesty, the AMD X2 4800+ in this PC has held up very well over time. Hell, I can play nearly any recent game at near max quality at 1024x768 or 1280x720 with my 7800GTX 512, so I've been making do. You can blame PS3 and XBOX 360 for gimping PC gaming and making such a thing possible.

Having zero upgrade path on Socket 939 eliminated any option of minor upgrades. If I wanted to upgrade, the CPU, RAM, and motherboard would all have to be replaced, and I didn't feel like spending $500-2000 on a sub-par new PC that wouldn't last me very long and would only be a minor upgrade over my current PC.

Any $600 "sub-par" rig over the last year would've allowed you to game at 1920x1200... And it'd easily last you a couple of years. I don't know about you but I'd consider a doubling of my resolution to be very substantial... Future-proofing simply doesn't work, you can spend $10,000 on a boutique build today and if it lasts 8 months more than an $800 budget build you'd be lucky... Unless you enjoy having sub-par performance for 2-3 years in between upgrades, or you simply don't game a lot on the PC or don't do anything demanding with it, which is it?

I can't imagine why on earth you would consider the GTX 200 and ATI's 4000 series crap... They were some of the best priced cards the market has seen in years, due to the closely matched release schedules and performance. The GF 8800 series were a good value for the opposite reasons, NV had no competition for so long that anyone who bought one early on got their money's worth (in part also because the number of demanding titles released in the interim was quite low).

The Intel Westmere processors are supposedly the last CPU-only parts on Intel's roadmap before they go off in the CPU/GPU combo or Larrabee direction. Intel's new direction after Westmere doesn't interest me very much, so going for Gulftown seems like the last chance for a traditional CPU upgrade, and it is no doubt a killer part.

If you think Intel's gonna stop releasing high-end processors after Gulftown you must seriously be smoking some weird juju... Just because they're working on Larrabee (or not, given all the false starts) doesn't mean they're sleeping on the high end, ever since they got their butt kicked over the P4 they've been sticking to a tick-tock release pattern where they commit to a major upgrade of some kind every year or so, with a refresh in between.

C'mon, are you just trolling? You can't be serious...
 
You still have to admit, if anything needs a 28nm die-shrink, it's Fermi. High costs, power consumption, heat, and even clock speeds are all things it appears to be struggling with at 40nm. Even though it most likely won't arrive any time in 2010, it would be in NVIDIA's best interest to move as quickly as possible and release a 28nm Fermi in 2011. Whether NVIDIA can actually pull off shrinking Fermi to 28nm is a whole different question.

NVIDIA pretty much hit the worst-case scenario on their initial 40nm yields.

Can they really do any worse with 28nm? What's next, 0% 28nm yields and NVIDIA scrapping Fermi completely? Do you really think NVIDIA engineers are so incompetent that such a thing is possible?

You do realize that a good half of that transition is entirely out of NV's hands and on TSMC's plate instead, no?
 
Any $600 "sub-par" rig over the last year would've allowed you to game at 1920x1200... And it'd easily last you a couple of years. I don't know about you but I'd consider a doubling of my resolution to be very substantial... Future-proofing simply doesn't work, you can spend $10,000 on a boutique build today and if it lasts 8 months more than an $800 budget build you'd be lucky... Unless you enjoy having sub-par performance for 2-3 years in between upgrades, or you simply don't game a lot on the PC or don't do anything demanding with it, which is it?
I really don't game very much on the PCs I build. They are primarily workstations. If I were more into gaming, I would have at least spent $100-200 and upgraded my GPU at some point, which alone should have allowed me to game at 1920x1200. I almost ended up buying an 8800GT at $150, since the G92 was a very impressive price/performance value at the time, but ultimately didn't because I only found myself gaming a handful of hours per month on average over the course of a year. I also still use a pair of high-end CRTs as my monitors, so gaming at a lower resolution isn't as bad as it would be on an LCD.

I never spent anywhere close to $10,000 on any PC. My last few builds have fallen in the $2000-4000 range. At an average of one computer every 4 years, that works out to ~$750 per year, or ~$1500 every two years. Only in the past couple of years could you build a half-way decent computer for ~$750. On the high end as well, you no longer have to spend $4000-6000 for a top-notch PC, though you still need ~$2000+ to buy something top-notch today.

The price of entry has really dropped that much, and I still find myself impressed at how cheap good performance is nowadays.

As for sub-par performance for 2-3 years, that is not entirely true. If you buy high-end, it usually works out to high-end performance in years 1-2, middle-tier performance in years 2-3, low-end performance in year 4, and completely outdated performance by year 5. This assumes you time everything perfectly and don't miss any important industry standards or performance requirements for various tasks. My AMD X2 has no trouble decoding 1080p H.264 video or playing back Blu-rays entirely in software, and that is the only thing in recent years I can think of that would have forced me to upgrade if my CPU didn't meet the performance requirement. The optimum time for me to upgrade is after 3 years, but I have the flexibility to wait until year 5 if I timed the previous build correctly around good tech and upcoming standards.

Buying high-end and enjoying high-end performance for 1-2 years with a maximum lifespan of 5 years isn't all that bad. Is buying middle-tier and enjoying middle-tier performance for only 1 year, quickly becoming low-end and outdated during the next 2 years, or building a new computer yearly to maintain middle-tier performance really more enjoyable?

Trust me, I have been strongly considering breaking the cycle and building new computers more often now that performance has gotten so cheap. Ultimately it all comes down to money, or rather the lack of it, and 3-5 year builds have historically been the only way for me to enjoy high-end performance, even if only for a short time.

I can't imagine why on earth you would consider the GTX 200 and ATI's 4000 series crap... They were some of the best priced cards the market has seen in years, due to the closely matched release schedules and performance. The GF 8800 series were a good value for the opposite reasons, NV had no competition for so long that anyone who bought one early on got their money's worth (in part also because the number of demanding titles released in the interim was quite low).
What I meant is that compared to the G80 and G92, I was totally unimpressed by the GT200 series of GPUs. It really just gave off a meh feeling to me. And I didn't say the ATI 4000 series was crap; I said it didn't look that great, meaning that even though it was more interesting than the GT200 series, it still wasn't enticing enough for me to want to go right out and buy it.

Prices may have been low, but low prices alone don't make a GPU an interesting or amazing part. On the other hand, ATI's Cypress and potentially NVIDIA's Fermi are both interesting and amazing parts in their own unique ways.

If you think Intel's gonna stop releasing high-end processors after Gulftown you must seriously be smoking some weird juju... Just because they're working on Larrabee (or not, given all the false starts) doesn't mean they're sleeping on the high end, ever since they got their butt kicked over the P4 they've been sticking to a tick-tock release pattern where they commit to a major upgrade of some kind every year or so, with a refresh in between.
Coming up soon on Intel's roadmap is Sandy Bridge. Sandy Bridge is a CPU/GPU hybrid.
Coming up soon on AMD's roadmap is Fusion. Fusion is also a CPU/GPU hybrid.

I never said anything about performance not continuing to increase, only that Westmere may very well be the last high-end CPU-only part before they go completely hybrid with Sandy Bridge. The same goes for AMD after they transition to Fusion. Once hybrid CPU/GPUs are the only thing available, the computing landscape may (and that is a BIG question mark) look very different than it does today. My mention of Larrabee was a separate thought unrelated to these hybrid chips, but it may still hint at where Intel is thinking of heading in the next 5-10 years. And as I said previously, I'm not completely sold on this whole hybrid CPU/GPU idea, so buying a Westmere now and letting all this hybrid stuff mature for a few years sounds like a great idea to me.

You do realize that a good half of that transition is entirely out of NV's hands and on TSMC's plate instead, no?
And didn't TSMC say that initial 28nm capacity should be ready by the end of 2010?

C'mon, are you just trolling? You can't be serious...
On the contrary, I believe you are the one going out of your way just to troll me. If you feel the need to continue this discussion about my PC building habits, let's not drag this thread even further off-topic. I would be happy to humor any further off-topic discussion via PM.
 
Last edited:
You still have to admit, if anything needs a 28nm die-shrink, it's Fermi. High costs, power consumption, heat, and even clock speeds are all things it appears to be struggling with at 40nm. Even though it most likely won't arrive any time in 2010, it would be in NVIDIA's best interest to move as quickly as possible and release a 28nm Fermi in 2011. Whether NVIDIA can actually pull off shrinking Fermi to 28nm is a whole different question.

Yea, but there is still the problem with a tiny die and all that heat...
 
I KNEW someone was going to say this... Okay, let's say 15% faster. Then the 470 need only be 11.24% faster than the 5870 to justify $499 using AMD's price/performance 5870 strategy.

Also

That's not conclusive evidence. I'm not making an argument that the 470 is going to be this fast or that fast. I'm just replying to some guy who said it needed to be 30% faster than a 5870 for it to justify a certain price.

Meh. My 5870 has been running 1GHz / 1300 since day one and I've only bumped the voltage to 1.187v; stock is like 1.12v (the overclocked ASUS cards are 1.3v). I've got a ton of headroom left. That's already a 15% overclock on the core, and ATI's refresh is supposed to be a 1GHz version... Fermi is going to need to really trump ATI, and I don't see it happening. Oh yea, and I only paid $379 shipped, so I expect the Nvidia card to be the same or less. Which ain't happening.
 
Yea, but there is still the problem with a tiny die and all that heat...

Heat is never an issue. I'm sorry, it just isn't. People need to get it out of their heads that:

1) Heat output alone has anything to do with temperature. Heat output combined with the cooling system gives you the temperature. Unless you think sub-ambient cooling systems actually power chips (which would be pretty cool). Yes, if you take the same thermal solution and increase the power, the temperature goes up. However, they don't use the exact same thermal solution across multiple GPU generations.
2) Chips need to run at 50C or 60C or 70C or 80C or even 90C.
3) Die size decreases, yes. So does power draw and the voltage required. Dynamic power decreases with the square of the voltage, and die area decreases with the square of the process size (see the rough numbers sketched after this list)...
4) That the thermal engineers are anywhere near the limits of what is possible to cool. They are still putting CHEAP heat sinks on their video cards. Until this stops, it is a complete non-issue. They design something that is the cheapest they can get away with, that is relatively quiet, and that keeps the card within its thermal spec.
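
To put some rough numbers on point 3, here's a toy Python sketch using the classical scaling rules (die area scaling with the square of the linear feature size, dynamic power with C*V^2*f). The 530mm2 die size and the voltages are made-up placeholders rather than published specs, and this ignores leakage, which is a big part of the story at these nodes:

# Toy 40nm -> 28nm shrink estimate; all inputs are illustrative placeholders.
def shrink_estimate(area_mm2, volts_old, volts_new, freq_scale=1.0):
    area_scale = (28 / 40) ** 2                               # area ~ (feature size)^2
    power_scale = (volts_new / volts_old) ** 2 * freq_scale   # dynamic power ~ C*V^2*f
    return area_mm2 * area_scale, power_scale

new_area, rel_power = shrink_estimate(area_mm2=530, volts_old=1.0, volts_new=0.9)
print(f"Die area: ~{new_area:.0f} mm2 ({(28 / 40) ** 2:.0%} of original)")
print(f"Dynamic power at the same clock: ~{rel_power:.0%} of original")

Point being, even a dumb straight shrink roughly halves the die area and takes a decent chunk out of dynamic power before any redesign, which is exactly why a 28nm Fermi is so tempting on paper.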
 