Kyle.....anyone? Do you think it's possible to unlock Fermi shaders?

ShuttleLuv

Supreme [H]ardness
Joined
Apr 12, 2003
Messages
7,295
I don't remember if they're hard-locked or unlockable, but does anyone think it's possible, like the old 9500 Pro? MAYBE a 50/50 chance of getting a full 512sp 480 if you improve cooling?
 
Highly doubt it.

After the whole 6X00 fiasco, they started cutting the leads, I believe.
 
IMO, Fermi needs a significant architecture improvement, not a few more shaders.
 
Yeah, after unlocking became more widely known, virtually all chips have hard-fused/lasered-off traces nowadays. Someone claimed that, because of how quickly they supposedly switched from 512 to 480 at the last minute, they might be unlockable with a BIOS he had (OBR on the EVGA forum), but he's not been heard from since ;)... so I'd pin the chance at zero.
 
Yeah, after unlocking became more widely known, virtually all chips have hard-fused/lasered-off traces nowadays. Someone claimed that, because of how quickly they supposedly switched from 512 to 480 at the last minute, they might be unlockable with a BIOS he had (OBR on the EVGA forum), but he's not been heard from since ;)... so I'd pin the chance at zero.


Damn, that's right. I forgot about that laser-cutting crap. Well, maybe there'll be a BIOS mod or something. Who knows.
 
Ha, the extra shaders made the card unmanufacturable.

Nvidia had to disable the cores and raise the voltage and clocks in order to save Fermi.
Those cores are disabled for a reason. By raising the voltage and clocks they still matched the performance of the 512-core part. Remember that the 512-core part was clocked at 625/1250 versus the 480-core part, which is clocked at 700/1401. Disabling the shaders and over-volting the card made up for the loss of performance.

I wonder what would happen if it were possible to unlock those shaders. How much voltage would you need to keep it stable?
If you're not satisfied with the performance, buy ATI. I'm sure flashing the BIOS is an easy way to void your warranty.
 
Damn, that's right. I forgot about that laser-cutting crap. Well, maybe there'll be a BIOS mod or something. Who knows.
Yeah, a BIOS will magically reconnect the traces that were laser-cut on the silicon of the chip :rolleyes: Try to educate yourself, please. :D
 
If it's true that the switch from 512 shaders to 480 was so last-minute that NVIDIA chose to disable those via BIOS trickery, then there's a slim chance that a 480 you purchase would still have functional extra shaders if you used a magical hacked BIOS. But as WorldExclusive said, those shaders were disabled for a reason, and there's a good chance enabling them would cause your card to artifact like crazy... even if such a BIOS were made available somehow.

If you really want a 512-shader part, wait for an "ultra" variant to come out in a few months, when the yields come up or when NV has accumulated a large enough stockpile of cherry-picked chips without so many defects.
 
I have yet to see any proof that the shaders have been physically disabled, and given the amount of time they had to decide to do so, I would actually say that they didn't, at least on the first couple of rounds with them; this was a VERY last-minute decision if reports are true (then again, they might well not be). It is true that they have made it a practice in the past to physically disable them, though.

Having said that, the OP should remember that the reason they changed the shader count was so they could actually sell a decent number of them. You might be able to enable them, but you might actually hurt your performance or, worse, enable a bad cluster or brick the card somehow. And given the shape of things, you would probably get more performance by slapping a water jacket on it and OCing the hell out of it.
 
Damn, that's right. I forgot about that laser-cutting crap. Well, maybe there'll be a BIOS mod or something. Who knows.

Yeah, a BIOS will magically reconnect the traces that were laser-cut on the silicon of the chip :rolleyes: Try to educate yourself, please. :D

He was saying "maybe it will be done via BIOS mod instead of a hardware mod", not that he can get around the hardware mod by messing with the BIOS. Think before you flame.
 
I have yet to see any proof that the shaders have been physically disabled, and given the amount of time they had to decide to do so, I would actually say that they didn't, at least on the first couple of rounds with them

Nvidia did state that cores were disabled. There has been so much info the last couple of days that I can't find it. You're not going to get that kind of info from review sites, Nvidia partners, etc.; you will have to consult a more radical source that has nothing to lose if Nvidia gets mad at them. SemiAccurate.

The reason the cores were disabled. Remember that no website in good standing with Nvidia will point this out.
http://www.semiaccurate.com/2010/03/29/why-nvidia-hacked-gtx480/
 
That is an awesome read. I bet if Nvidia hadn't set the bar so high, but more on par with ATI, it wouldn't have been such a flop, just more competition. Now it's overpriced, overpowered competition. ATI is cheaper and uses less power than Nvidia. I bet they're still working on tweaking their cards and will come out with a more user-friendly card later this year, and ATI will have their next gen out, and by then Nvidia will finally go, "We got it! It beats your 5800 series! ...Oh shit, you guys have new cards."
 
I wonder if I can unlock the turbocharger and 2 extra cylinders on my Sentra to make it a GT-R.
 
I have yet to see any proof that the shaders have been physically disabled, and given the amount of time they had to decide to do so, I would actually say that they didn't, at least on the first couple of rounds with them; this was a VERY last-minute decision if reports are true (then again, they might well not be). It is true that they have made it a practice in the past to physically disable them, though.

Having said that, the OP should remember that the reason they changed the shader count was so they could actually sell a decent number of them. You might be able to enable them, but you might actually hurt your performance or, worse, enable a bad cluster or brick the card somehow. And given the shape of things, you would probably get more performance by slapping a water jacket on it and OCing the hell out of it.

It will be very interesting to see, but I'm thinking it's unlikely they were able to physically disable the extra shaders if the decision was made late in the game to go with 480. I'm no chip expert, but it seems like that would require rerunning all the chips -- not cost-effective or even possible when you've got a couple weeks before release.

On the other hand, the numbers are supposed to be so few that it's possible they were able to put the dogs back on the operating table for castration at the last minute.
 
I wonder if I can unlock the turbocharger and 2 extra cylinders on my Sentra to make it a GT-R.

But you could theoretically get a more powerful engine into that Sentra... maybe putting a GTX480 chip onto a GT210 PCB... IDEA :eek:
 
It will be very interesting to see, but I'm thinking it's unlikely they were able to physically disable the extra shaders if the decision was made late in the game to go with 480. I'm no chip expert, but it seems like that would require rerunning all the chips -- not cost-effective or even possible when you've got a couple weeks before release.

On the other hand, the numbers are supposed to be so few that it's possible they were able to put the dogs back on the operating table for castration at the last minute.

I wonder as well, the other argument here being that they would have had to do some kind of testing to find the weaker clusters. Still, I have no idea. But I am thinking you're right; even if the cost wasn't an issue, time surely was.
 
Quite likely the first-run chips are not physically disabled; however, I'm sure future runs will be.
 
Hey, do you think that if you lower the voltage on the 480 and then enable the shaders (IF they're able to be enabled...), it'll be stable?
 
Remember that the 480-core part at 700/1401 and the 512-core part at 625/1250 have the same performance.

Lowering the clock speeds and voltage will decrease performance; enabling the shaders will restore what was lost.
Disabling the shaders decreases performance; increasing the clock speeds and voltage restores what was lost.

Nvidia had to disable the shaders because the part was too hot and unmanufacturable. When the shaders were disabled, they increased the voltage and were still able to top the 5870. If the voltage and clocks couldn't have been increased, Fermi would have been slower than the 5870.
Nvidia was willing to run a high TDP and increase their voltages to have the fastest card on the market. It was important to them that Fermi achieve that goal.

Now, the beast card Nvidia wanted to make was the 512-core part @ 750/1500. It would have destroyed the 5870, but unfortunately it wasn't possible. With a respin or a new generation of cards, we may still see this card happen.
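
For a rough sanity check of those numbers, here's a minimal Python sketch using the crude cores x shader-clock approximation a later post in this thread describes (the clock pairs are core/shader; this ignores memory bandwidth, ROPs, drivers, and everything else, so treat it as back-of-the-envelope only):

Code:
# Crude linear throughput estimate: shader cores x shader clock (MHz).
# Deliberately ignores memory bandwidth, ROPs, etc. -- a sanity check,
# not a benchmark.
configs = {
    "GTX 480 as shipped (480sp @ 700/1401)": (480, 1401),
    "Full part (512sp @ 625/1250)":          (512, 1250),
    "'Beast' target (512sp @ 750/1500)":     (512, 1500),
}

baseline = 480 * 1401  # the shipping GTX 480
for name, (cores, shader_mhz) in configs.items():
    score = cores * shader_mhz
    delta = (score - baseline) / baseline
    print(f"{name}: {score:,} ({delta:+.1%} vs the shipping 480)")

By this metric the shipping 480 actually comes out a few percent ahead of the cancelled 512-core @ 625/1250, which fits the claim that the clock and voltage bump made up for the disabled shaders.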
 
Didn't one of the reviews mention a block of shaders being disabled to yield more sellable cards?
 
Yeah, a BIOS will magically reconnect the traces that were laser-cut on the silicon of the chip :rolleyes: Try to educate yourself, please. :D

Heh. Go easy on Shuttleluv, he and goldentiger are the last two surviving members of the Nvidia fanboys...errmm I mean fan club.
 
If they cut them the way they cut the Athlon XPs it would be possible to unlock them, I suppose.
 
I think it would've been a good idea for nVidia to soft-lock the shaders. I mean, they'd get a few more sales from people, maybe even steal a few more ATi users. They wouldn't have much, if anything, to lose in doing so. If you blow it up or screw up the flash, then it's your fault, huh; go buy another.
But yes, I can see doubt about soft-locking the cores, to be honest. Wait and see, huh? Just don't get your hopes up.
 
It will be very interesting to see, but I'm thinking it's unlikely they were able to physically disable the extra shaders if the decision was made late in the game to go with 480. I'm no chip expert, but it seems like that would require rerunning all the chips -- not cost-effective or even possible when you've got a couple weeks before release.

On the other hand, the numbers are supposed to be so few that it's possible they were able to put the dogs back on the operating table for castration at the last minute.

I don't think they have "physically disabled" the cores per se, but rather were unable to come up with a significant yield at 512 cores. Therefore they had to settle for less.

I mean, if there were 512 cores physically, why would they disable them? It's like putting money and effort into developing a V6 engine and then artificially disabling 2 cylinders...
 
I say wait until people start getting retail cards and then see if someone is crazy enough to try to unlock additional cores :)
 
Maybe Nvidia has a secret plan: with every driver update they will unlock 1 extra core.
 
I don't think they have "physically disabled" the cores per se, but rather were unable to come up with a significant yield at 512 cores. Therefore they had to settle for less.

I mean, if there were 512 cores physically, why would they disable them? It's like putting money and effort into developing a V6 engine and then artificially disabling 2 cylinders...

There are 512 cores on each chip. Their goal was to make a lot of fast cards. They found they could fuse off some bad cores and get a huge bump in clock speed. The linear approximation of speed for a similar architecture is given by number of cores x clock speed. So if they had 512 cores at 600MHz, that comes out to 307,200. If they drop the cores to 480 but get a clock bump to 700, the same math comes out to 336,000, or 9.4% faster, AND they get higher yields.
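
Quick check of that math in a Python shell (same crude cores x clock approximation, using those hypothetical 600MHz/700MHz clocks):

Code:
>>> 512 * 600   # hypothetical full part
307200
>>> 480 * 700   # fused-down part with the clock bump
336000
>>> 336000 / 307200 - 1
0.09375

That 0.09375 is the 9.4% quoted above.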

Statistically speaking, there will be a chip out there that has no defects on its cores and could probably hit 900MHz at 512 cores. However, there's only going to be one like that. (It would be a VERY nice card.)

But in reality, they now need to disable these "bad" cores that limit clock speeds and/or are simply non-functioning. They will disable them in the BIOS. The real question is whether they are also making physical modifications to prevent you from turning those cores back on. My guess is they aren't. Or at least they aren't yet.

The only reason to spend the money to fuse off the extra cores is if you are worried someone will unlock them AND it will undercut the sale of another product. Seeing as they currently have no "higher" product to undercut, there isn't much worry. If/when yields increase and they can produce a 512-core chip that runs at 700MHz and call it a 485 (for lack of a better name), THEN I'd expect to see the 480s being fused. But only if they are binning a large quantity of chips that could be unlocked to "485 status".
 
I don't think they have "physically disabled" the cores per se, but rather were unable to come up with a significant yield at 512 cores. Therefore they had to settle for less.

I mean, if there were 512 cores physically, why would they disable them? It's like putting money and effort into developing a V6 engine and then artificially disabling 2 cylinders...

Actually, if you can ignore the venom, Charlie has it well explained here: http://www.semiaccurate.com/2010/03/29/why-nvidia-hacked-gtx480/

He makes a good argument that the overall performance of the card is improved by disabling the weaker clusters in order to gain more clock speed.
 
There are 512 cores on each chip. Their goal was to make a lot of fast cards. They found they could fuse off some bad cores and get a huge bump in clock speed. The linear approximation of speed for a similar architecture is given by number of cores x clock speed. So if they had 512 cores at 600MHz, that comes out to 307,200. If they drop the cores to 480 but get a clock bump to 700, the same math comes out to 336,000, or 9.4% faster, AND they get higher yields.

Statistically speaking, there will be a chip out there that has no defects on its cores and could probably hit 900MHz at 512 cores. However, there's only going to be one like that. (It would be a VERY nice card.)

Well said. ;) If they could have increased the clocks and maintained the 512 shaders, this card would have been mind-blowing. ;) 5970-like performance.
The people who are asking for a 512 version should reconsider, since the 480 version is better and cooler, believe it or not.
 
Well said. ;) If they could have increased the clocks and maintained the 512 shaders, this card would have been mind-blowing. ;) 5970-like performance.
The people who are asking for a 512 version should reconsider, since the 480 version is better and cooler, believe it or not.

It would be interesting to see an LN2-cooled (or better) attempt with high voltages on one of these things.
 
I thought about waiting for a 512, but I don't think we'll see it till next year, if that.
 
Yeah, if anyone finds a way to unlock those, it's a meltdown for sure, unless someone is using water. Then it either works or they have a water boiler.
 
Well, one thing to draw from this is that my GTX 280 is plenty good enough, at least until they release the inevitable refresh with not-absurd wattage, temps, and all cores enabled.
 
Even if they do refresh, you might not see a wattage decrease. Look at the GTX280 to GTX285: instead of saving watts with the die shrink, they clocked the cards higher and kept the wattage about the same. This time around, though, maybe they could improve the number of usable cores with a refresh.
 
Even if they do refresh, you might not see a wattage decrease. Look at the GTX280 to GTX285: instead of saving watts with the die shrink, they clocked the cards higher and kept the wattage about the same. This time around, though, maybe they could improve the number of usable cores with a refresh.

I don't know, they can't be happy with this as it stands. You just can't have a card with a 10% performance advantage draw more power than two of your competitor's cards. I think that if they could just make it work as intended, they could make it work (maybe not profitably for a while, but work).
 