mscsniperx
n00b
- Joined
- Jan 24, 2004
- Messages
- 7
Moore's law originally described the increase of IC transistor density over time, and it has since been used to predict PC CPU performance: roughly, processor performance should double every 18 months. That pace has often been met or exceeded over the past few years. Even more exciting for consumers, as Moore's law was being met and exceeded, the relative price of cutting-edge CPUs has dropped.
BUT, WHAT ABOUT THE GRAPHICS CARD INDUSTRY?
Graphics cards are built on the same general foundations as PC CPUs. They are in effect graphical CPUs (GPUs), used for crunching graphical data. One would therefore expect graphics cards to follow Moore's law along a similar course as CPUs have. Well, you are in for a nasty surprise!
Today the ATI 9800XT is held as the fastest video card on the consumer market... but has the 9800XT followed Moore's law? After some observations the answer becomes a BIG FAT "HECK NO." Even worse, not only has the graphics card industry fallen far short of Moore's law, but prices have been INCREASING. In effect you are paying FAR more and getting FAR less when compared to CPU performance and price trends over the same period.
Consider this...
ATI's R300 (9700 Pro) debuted in July 2002.
The 9800XT is the fastest card on the market as of February 2004.
Real-world benchmarks show the 9800XT to be only roughly 12% faster than the 9700 Pro [SOURCE: TOMSHARDWARE.COM BENCHMARKS]. (These figures change with antialiasing enabled, but antialiasing is a feature rarely if ever used by the end user; the quality gains are minimal compared to the performance loss.)
Do the math... 12% / 19 months ≈ 7.6%/year performance gain for video cards.
Moore's law: 100% / 18 months ≈ 67%/year.
(CPUs have met or exceeded Moore's law to date)
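The rate arithmetic above can be sketched in a few lines of Python — this uses the post's own simple linear scaling (gain spread evenly over the months, not compounded), with the post's figures of 12% over 19 months for GPUs and a doubling every 18 months for Moore's law:

```python
# Simple linear annualization, as in the post: X% gained over M months
# works out to (X * 12 / M)% per year. Compounding would give slightly
# different numbers, but the post's comparison is linear.

def linear_annual_rate(percent_gain, months):
    """Annualize a total percentage gain achieved over `months` months."""
    return percent_gain * 12 / months

gpu = linear_annual_rate(12, 19)     # video cards: 12% over 19 months
moore = linear_annual_rate(100, 18)  # Moore's law: 100% over 18 months

print(round(gpu, 1))       # 7.6  (%/year for video cards)
print(round(moore, 1))     # 66.7 (%/year under Moore's law)
print(round(moore / gpu))  # 9 -> GPUs at roughly 1/9th the pace
```

The ~9x ratio between the two annual rates is where the "1/9th of Moore's law" figure below comes from.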
In other terms, the graphics card industry has delivered only about 1/9th the pace of Moore's law. And if that isn't bad enough, graphics cards have been increasing in price! Top-of-the-line consumer graphics cards now top out at $500! That's a 25% increase over the previous year's highest-end model!
$500 to get 1/9th of Moore's law?! How sad.
Now let's put it ALL in comparison:
1/9th of Moore's law and a 25% price increase.
Compare that to the price & performance structure of PC CPUs.
If CPUs were normalized to fit the trend of graphics cards,
a CPU would cost you 9 × (current CPU price) + 25%.
So, for a P4 3.4 GHz @ $459, its current selling price,
a P4 3.4 GHz today would cost us roughly $5,160!
Conversely, if graphics cards followed the CPUs' price/performance structure, a 9800 Pro today would only be worth about $55.
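The normalization above is just two multiplications; a quick sketch with the post's numbers ($459 P4, $500 top-end card, 9x performance shortfall, 25% price creep):

```python
# What a CPU would cost if it followed the graphics-card trend
# (9x the price for the performance shortfall, plus 25% price creep),
# and what a graphics card would cost if it followed the CPU trend.
# All inputs are the post's own figures.

P4_PRICE = 459      # P4 3.4 GHz street price
GPU_PRICE = 500     # top-end consumer card price
SHORTFALL = 9       # GPUs deliver ~1/9th of Moore's-law pace
PRICE_CREEP = 0.25  # 25% year-over-year price increase

cpu_if_like_gpu = P4_PRICE * SHORTFALL * (1 + PRICE_CREEP)
gpu_if_like_cpu = GPU_PRICE / SHORTFALL

print(round(cpu_if_like_gpu))  # 5164 -> roughly $5,160
print(round(gpu_if_like_cpu))  # 56   -> roughly $55-56
```

Exact figures: 459 × 9 × 1.25 = $5,163.75, and 500 / 9 ≈ $55.6 — the same order of magnitude either way you round.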
There are many technical reasons for the disparity, but it is disturbing nonetheless. Many other facets of IC-based architecture have been following or beating Moore's law, YET the most sought-after and NEEDED performance gains in computers these days are in GRAPHICS! And it is graphics that lags behind (BIG TIME).
This translates into new-generation games running at disappointing frame rates. Newer games like Far Cry (demo released), Doom 3, HL2, etc. completely SMOKE the fastest $500 cards on the market, which chug along at choppy frame rates in the 40s. Perhaps it's time consumers started asking why, in the graphics industry, they are paying more and getting less compared to its siblings in the IC industry. While we praise AMD and Intel for exceeding Moore's law, perhaps it's time we asked why graphics lags so far behind.