Graphics Cards Fall FAR SHORT of MOORE'S LAW!

Originally posted by siegecraft4
Oh boy... this thread is a mess. First of all, Moore's law is not a LAW but a THEORY; Moore isn't going to come and punish graphics card companies for not following his theory. He based it on trends in the industry during his time, and times have changed. I think people should really stop trying to apply this ancient theory to modern times. Don't you think graphics chipset companies are driven to have the fastest product so they can grab a greater market share among enthusiasts? If they could develop GPUs any faster, they would.

However, I do agree that it's not worth it to upgrade as often anymore; there have been no huge breakthroughs since R300.

And believe me, I don't know too much about Moore's law, but I think any person with common sense can see that this argument is moot.

Has it been 18 months since R300 was released? But yes, Moore's law was only meant to be applied to CPUs, AFAIK.
 
The article here is a sarcastic look at Moore's law. Before you fly off the handle, get your pocket protectors in a twist, and tell me you're a supreme geek while I don't know jack (you can keep your supreme nerd status, btw):

The point is that Moore's law (or theory, rather) was originally a prediction about increases in IC density over time.

Marketing and the general public then took that prediction and began treating it as a LAW dictating that CPU performance would double every 18 months.

That was NEVER part of Moore's theory, yet it's now mainstream to say, "Well, PC speed and performance double every 18 months and are beating Moore's law."

To see how ridiculous this is, let's see what happens when we apply this overhyped, bastardized version of the theory to GPUs.

GPUs fall far short of CPUs when it comes to end performance gains.

The 9700 Pro, which first came out about 18 months ago, compared to the 9800 XT shows nothing CLOSE to a 100% performance gain; rather, 20% or less in average REAL-WORLD FPS benchmarks.
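To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python. The baseline FPS figure is just a placeholder, not a measured result; the point is what an "18-month doubling" would have predicted versus the roughly 20% gain claimed above.

```python
# Back-of-the-envelope: what "performance doubles every 18 months" would
# predict versus the ~20% real-world FPS gain claimed above.
# baseline_fps is a made-up placeholder, not a benchmark result.

months = 18
baseline_fps = 100.0                                # hypothetical 9700 Pro score
predicted_fps = baseline_fps * 2 ** (months / 18)   # the "doubling" version of the law
observed_fps = baseline_fps * 1.20                  # the ~20% gain claimed for the 9800 XT

print(f"Predicted by the 18-month doubling myth: {predicted_fps:.0f} fps")
print(f"Observed (per the ~20% claim):           {observed_fps:.0f} fps")
print(f"Shortfall factor:                        {predicted_fps / observed_fps:.2f}x")
```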
 
Originally posted by mscsniperx
The 9700 Pro, which first came out about 18 months ago, compared to the 9800 XT shows nothing CLOSE to a 100% performance gain; rather, 20% or less in average REAL-WORLD FPS benchmarks.

But since the 9800 XT has been out for several months, shouldn't we compare the 9700 Pro to the upcoming R420? That would be a much more interesting comparison.
 
The R420 is hinted to offer double the performance of the 9800.

Wouldn't that be nice?

The problem here is that some people are arguing, "Well, at such-and-such an operation, utilization of the Z-buffer allowed the 9800 to be twice as quick," blah blah blah.

The end performance gain in FPS has historically fallen quite short of the marketing hype. Real-world overall GPU gains, while consistent, pale in comparison to CPU gains (at least thus far).

Notice I say REAL WORLD; at a lower level (MIPS/FLOPS), the story becomes very different.

GPUs in themselves are remarkably fast, faster than CPUs in some cases. Indeed, the future of high-end computational programs may actually call upon the processing speed of the GPU. In fact, such programming language interfaces already exist.
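For what it's worth, the programming model those interfaces expose (BrookGPU and Cg-style shaders are the examples I have in mind; the post above doesn't name any) is basically "write a tiny per-element kernel and let the hardware apply it across whole streams of data." Here is a CPU-only Python sketch of that idea, just to show its shape; a real GPGPU runtime would execute the kernel on the graphics card.

```python
# CPU-only illustration of the stream-programming model that GPGPU
# interfaces expose: define a small per-element kernel, then apply it
# across entire data streams (a GPU would do this in parallel).
# The data values here are arbitrary examples.

def saxpy_kernel(a, x, y):
    """One fragment of work: a*x + y."""
    return a * x + y

x_stream = [1.0, 2.0, 3.0, 4.0]
y_stream = [10.0, 20.0, 30.0, 40.0]

# A GPGPU runtime would run the kernel over the streams on the GPU;
# here a plain comprehension stands in for it.
result = [saxpy_kernel(2.0, x, y) for x, y in zip(x_stream, y_stream)]
print(result)  # [12.0, 24.0, 36.0, 48.0]
```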
 
GPU capabilities are increasing faster than CPU power and are exceeding Moore's Law.

Moore's Law was actually formulated as follows, "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page). Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years."

Rewriting this: "The number of transistors per chip that yields the minimum cost per transistor has increased at a rate of roughly a factor of two per year" (Ars Technica, 2003).

http://arstechnica.com/paedia/m/moore/moore-1.html
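If you want to see what that original formulation actually implies, here is a tiny sketch. The starting component count is arbitrary; the point is the factor-of-two-per-year growth in chip complexity Moore described, which says nothing about FPS.

```python
# Moore's original observation: the number of components per chip at the
# minimum cost per component roughly doubles every year.
# The starting count below is arbitrary, chosen only for illustration.

start_components = 64   # arbitrary starting point

for year in range(0, 11):
    count = start_components * 2 ** year
    print(f"Year {year:2d}: ~{count:,} components per chip")
```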

Read up folks :)
 