Graphics Cards Fall FAR SHORT of MOORE'S LAW!

mscsniperx

Moore's law originally offered an observation about IC density increasing over time. It has since been used to predict PC CPU performance: processor speed should roughly double every 18 months. Over the past few years that pace has often been exceeded, and, even better for consumers, the relative price of cutting-edge CPUs has dropped as Moore's law was being met and beaten.


BUT, WHAT ABOUT THE GRAPHICS CARD INDUSTRY?

Graphics cards are built on the same general foundations as PC CPUs. They are in effect graphical CPUs (GPUs), used for crunching graphical data. One would therefore expect graphics cards to follow Moore's law along a course similar to CPUs. Well, you are in for a nasty surprise!

Today the ATI 9800 XT is held up as the fastest video card on the consumer market... But has the 9800 PRO followed Moore's law? After some observation the answer becomes a BIG FAT "HECK NO." Even worse, not only has the graphics card industry fallen far short of Moore's law, but prices have been INCREASING. In effect you are paying FAR more and getting FAR less when compared to CPU performance and price trends over time.

Consider this...

ATI's R300 (9700 pro) debuted in July 2002...

The 9800XT is currently the fastest card on the market as of February 2004

The 9800 Pro has been shown through real-world benchmarks to be only roughly 12% faster than the 9700 Pro [SOURCE: TOMSHARDWARE.COM BENCHMARKS] (these figures change when antialiasing is enabled). But antialiasing is a feature rarely if ever used by the end user; the quality gains are minimal compared to the performance loss.


Do the math... 12% / 19 months ≈ 7.6% per year performance gain for video cards.

Moore's law = 100% / 18 months ≈ 67% per year.
(CPUs have met or exceeded Moore's law to date.)
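
For anyone who wants to check the arithmetic, here is a quick sketch using only the figures above; the compounded rates are an extra point of comparison, not a claim made anywhere in this post:

```python
# Simple (linear) per-year rates as used above, with compounded
# equivalents for comparison. All input figures are the post's own:
# ~12% GPU gain over 19 months vs. a doubling (100%) every 18 months.

def simple_annual(gain, months):
    """Scale a gain linearly to a 12-month period."""
    return gain * 12 / months

def compound_annual(gain, months):
    """Scale a gain to 12 months assuming compound growth."""
    return (1 + gain) ** (12 / months) - 1

gpu_gain, gpu_months = 0.12, 19      # 9700 Pro -> top card today, per the post
moore_gain, moore_months = 1.00, 18  # "doubles every 18 months"

print(f"GPU, simple:     {simple_annual(gpu_gain, gpu_months):.1%}/yr")       # ~7.6%/yr
print(f"GPU, compound:   {compound_annual(gpu_gain, gpu_months):.1%}/yr")     # ~7.4%/yr
print(f"Moore, simple:   {simple_annual(moore_gain, moore_months):.1%}/yr")   # ~66.7%/yr
print(f"Moore, compound: {compound_annual(moore_gain, moore_months):.1%}/yr") # ~58.7%/yr
```

Either way you slice it, the ratio between the two rates lands near an order of magnitude, which is where the 1/9th figure below comes from.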

In other terms, the graphics card industry has met only 1/9th of Moore's law. And if this isn't bad enough, graphics cards have been increasing in price! The top-of-the-line consumer graphics cards now top out at $500!!!! That's a 25% increase over the previous year's highest-end model!

$500 to get 1/9th of Moore's law?! How sad.

Now let's put it all in perspective...

1/9th of Moore's law and a 25% price increase,
compared to the price & performance structure of PC CPUs.

If CPUs were normalized to fit the trend of graphics cards...

A CPU would cost you 9 × (current CPU price) + 25%.

So, for a P4 3.4 GHz at $459, the current selling price...

A P4 3.4 GHz today would cost us about $5,100!

Conversely, if graphics cards followed the CPU price/performance structure, a 9800 Pro today would only be worth about $55.
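
A short sketch reproducing the rough normalization above (the rounding to $5,100 and $55 is the post's own; the exact products come out slightly higher):

```python
# Reproduce the rough normalization above: a CPU repriced as if it had
# followed the graphics-card trend (1/9th the gain, +25% price), and a
# top-end graphics card repriced as if it had followed the CPU trend.

p4_price = 459    # P4 3.4 GHz price quoted above
gpu_price = 500   # top-end graphics card price quoted above

cpu_priced_like_gpu = p4_price * 9 * 1.25   # 9x for 1/9th the gain, plus the 25% price hike
gpu_priced_like_cpu = gpu_price / 9         # the reverse, dividing by 9 only

print(f"CPU priced like a GPU:      ${cpu_priced_like_gpu:,.0f}")  # ~$5,164 (rounded to $5,100 above)
print(f"9800 Pro priced like a CPU: ${gpu_priced_like_cpu:.0f}")   # ~$56 (quoted as $55 above)
```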

There are many technical reasons for the disparity, but it is rather disturbing nonetheless. Many other facets of IC-based architecture have been following or beating Moore's law, yet the most sought-after and NEEDED performance gain in computers these days IS graphics! And it is graphics that lags behind (BIG TIME).
This translates into new-generation games running at disappointing frame rates. The newer games like Far Cry (demo released), Doom 3, HL2, etc. completely SMOKE the fastest $500 cards on the market, which get choppy frame rates in the 40s. Perhaps it's time the consumer started asking why. Why, in the graphics industry, are they paying more and getting less compared to their siblings in the IC industry? While we praise AMD and Intel for exceeding Moore's law, perhaps it's time we asked why graphics lag so far behind.
 
Originally posted by mscsniperx
This translates into new-generation games running at disappointing frame rates. The newer games like Far Cry (demo released), Doom 3, HL2, etc. completely SMOKE the fastest $500 cards on the market, which get choppy frame rates in the 40s. While we praise AMD and Intel for exceeding Moore's law, perhaps it's time we asked why graphics lag so far behind.


Well, I have no problem playing Far Cry on my ancient 9700 PRO ;) Framerates in the 40s aren't choppy. Have you ever heard of framerates in the 10s? ;)

Believe me, I am quite sure the video card chip makers put every effort into making their cards as fast as possible.

But you seem to forget one obvious thing.

Image quality??? Does image quality double with every new CPU? ;)

With every new generation I have seen the image quality increase considerably. Voodoo 2 to GeForce 2. GeForce 4 to 9700 PRO. The R420 will probably have even better image quality.

If video cards consisted solely of a GPU and nothing else, I am sure they would follow Moore's law much better. But video cards are by nature much more complex than a CPU plus memory. That is why they are so expensive, and since they require a lot more space than a CPU or memory, it's harder to cram even more transistors and such into the current AGP specs.
 
Oh, and there was a 100% speed increase from a Ti 4600 to a 9700 PRO at the settings I wanted to play at...
 
"Graphics cards have never been under Moore's law. It only applies to CPU's."

That was my point, or more to the point: we are paying $500 to gain only a 12% increase in performance over a product life cycle of 18 months for graphics processors.

$500 for only 12%! Very sad...


I'm interested in knowing WHY GPUs fall so far short of the performance gains of CPUs, and for that matter of other PC components, which seem to follow or surpass Moore's law.
 
That cut and paste article is dumb. It's comparing a law based on the *number of transistors* doubling every 18 months at roughly the same price. The CPU, in isolation may also be twice as fast (MIPS rating or similar low level benchmark, taking advantage of any new CPU strengths through recompilation/optimized compilers), but overall system performance doesn't double every 18 months. That's his mistake. He's taking a benchmark that is CPU/system limited and/or doesn't take advantage of new GPU features and comparing that to CPU in isolation performance increases. Remember when the P4 was slow? Optimized software changed that a lot, even on the original Willamette CPUs.

The graphical power of several GPUs has exceeded Moore's law. Just as in the case of overall system performance with a CPU upgrade, you don't get a doubling of performance by just upgrading the video card. Low-level benchmarks show the processing power is improving quite nicely. It's funny that he chooses to ignore that very point with his FSAA/AF comment. The processing of sub-samples is one of the easiest ways to show how much performance is improving. If he were thinking, running programs that have both DX8 and DX9 modes, in addition to running with and without AF/FSAA, would show an even bigger difference between generations. And that kind of test would also be closer to an apples-to-apples comparison with the way CPU performance increases are measured.

But that would require some actual testing and understanding of what is being measured with Moore's Law, which the author is obviously incapable of. That almost sounds like a dumb post Ed at OCers would make.

edit: if you wrote that yourself, I'm really sorry for comparing you to Ed. I don't want to sound that cruel.
 
But antialiasing is a feature rarely if ever used by the end user; the quality gains are minimal compared to the performance loss.

Ummm... I think that almost everyone who buys a 9800 XT uses AA/AF. Why would you buy a $500 card and not use it to its potential? Oh, and "the quality gains are minimal compared to the performance loss"? :D
 
Moore's law describes gains in density on silicon substrates. It is a broad, albeit often used, analogy to extend Moore's law to the processing power of the IC under consideration.

I do know what Moore's law is. The article is a sarcastic look at applying the so-called Moore's law of performance doubling every 18 months to GPUs. It points out how far GPUs fall short of such a comparison and seeks to raise some questions.

If we link Moore's law to CPUs and performance, we praise CPUs for "BREAKING" or "GOING BEYOND" Moore's law.

If we do the same for GPUs, linking performance to doubling every 18 months, the analogy falls far short. I think that is the point of the article.


What IS a fact: the overall average performance gain between today's highest-end GPUs and those of 18 months ago is roughly 12%. 12%!!!

How does a 12% gain in 18 months for $500 impress you??


Now to the argument "oh well, the 12% is dependent on CPU speed also." Fine. DID I mention that the 12% is measured using today's fastest CPUs? If you were to take the fastest GPU of today and test end-result performance on a CPU of 18 months ago, the result would be LESS than 12%.

If anything, CPUs have been EXAGGERATING the end performance of GPUs, not harming it.

BTW, the 9700 PRO had 110 million transistors; the 9800 XT has 125 million.

The point here is that the REAL performance gain of GPUs over the past 18 months is a sad figure at best.



"Low level benchmarks show the processing power is improving quite nicely"

And what low-level benchmarks are those? And how are they so nice?
 
Originally posted by mscsniperx
We are paying $500 to gain only a 12% increase in performance over a product life cycle of 18 months for graphics processors. I'm interested in knowing WHY GPUs fall so far short of the performance gains of CPUs.

We have always paid $500 for the latest and greatest. I don't see why everyone seems to think video card prices have gone up lately. I remember when the GeForce 2 GTS was new it cost about $500, the GeForce 3 even more, the GeForce 4 about $500.

So they actually get cheaper and cheaper. More performance for the same money!

And you can't compare the 9800 PRO to the 9800 XT. They are essentially the same, and there was definitely not 18 months between those.
 
Moore's law didn't state that processor speeds double;

it stated that transistor density doubles, and with that, speeds have almost doubled.
 
The comparison is between the 9700 Pro and the 9800 XT, and there HAS been 18 months between the two.


Sorry, graphics card prices have been increasing. $500 is NOT the norm; the leading-edge cards of several years ago did NOT cost $500, they were $400...

True, the $500 price wasn't introduced by the 9800 XT, but the comparison is being made between the

9700 Pro, which cost $400 in 2002, and the

9800 Pro, which cost $500 in 2004.


The time is 18 months; the overall performance gain is roughly 12%.

CPU performance gains have been 100% or greater over the past 18 months.

Moore's law, as applied to CPUs, implies that PERFORMANCE doubles every 18 months.


Sorry, GPU performance/price ratios over time are pitiful compared to CPUs. I don't know why everyone is defending it.
 
Originally posted by mscsniperx
Sorry, GPU performance/price ratios over time are pitiful compared to CPUs. I don't know why everyone is defending it.


Because you really don't know what you're talking about and so they're trying to talk some sense into you. :rolleyes:
 
Originally posted by mscsniperx
The comparison is between the 9700 Pro and the 9800 XT, and there HAS been 18 months between the two. Graphics card prices have been increasing: the 9700 Pro cost $400 in 2002, the 9800 Pro $500 in 2004. The time is 18 months; the overall performance gain is roughly 12%.

I agree not much has happened between the 9700 PRO and the 9800 XT, but that is mainly because the 9700 PRO was such a killer card.

But the difference is larger than 12%, though. Perhaps more like 20%?

And have you ever heard of inflation? Believe me, when the GF3 was new it cost approximately $500. Hell, even the rehashed GeForce 3 Ti 500 cost about that much when it was new.

And subtract the cost of the memory on today's video cards versus what you found on the video cards of the past, and they are cheaper today than in the past.

Defending or not defending, it's too much to ask that video card makers sell their cards at a loss...

I mean, if you just go one step below the latest and fastest, you will see they aren't that expensive. The latest and greatest has always cost this much, like it or not.

There are worse offenders, you know? Intel wanting $900 or even more for a P4 3.2 EE? And that has no memory onboard except that mega-large 2 MB L2 cache.

The A64, while being a bit cheaper and faster, isn't much better either.
 
I understand what you are talking about, although I think the performance has increased more than you are saying. What pisses me off is that high-end graphics cards are costing more and more these days. I remember when I got my TNT2 Ultra back in '98 or '99, can't really remember when, but it was a top-of-the-line card and it cost under $300 at CompUSA. Now a top-of-the-line card costs over $400. That pisses me off.
What really pisses me off, though, is that the mid-range cards are so much different than their higher-end brothers. The 9600 is a POS compared to the 9800. I miss the days of the high-end cards being the same as the mid-range cards, only with a much higher clock speed. For example, what's the difference between a GeForce4 4600 and a 4200? Other than clock speed, not much, huh?

Also... inflation has little to do with graphics card prices increasing. Inflation is what, 1%-2% at the most?
 
You people have WAY too much time on your hands. I'm not even joking either.

-warsaw
 
Originally posted by TMCM
What really pisses me off, though, is that the mid-range cards are so much different than their higher-end brothers. The 9600 is a POS compared to the 9800. Also... inflation has little to do with graphics card prices increasing. Inflation is what, 1%-2% at the most?

Inflation is more than 1-2%. But if there is such a difference in performance between the 9600 and the 9800, why would they cost the same? ;)

Look at AMD. I bet they lost a lot of money on the 2500+, since it was so overclockable that everybody got one and overclocked it to 3200+ speeds. Now they have locked the multiplier to limit the overclockability somewhat, but it's still a steal.

I don't see what we should be pissed about. Today's video cards have at least 128 MB, and on the high end 256 MB, of onboard RAM. That has to cost some extra, wouldn't it? In fact, that explains why the video cards today are at most 10% more expensive than in the past. And since you noted that there isn't much difference in speed between a 9800 XT and a 9800 PRO... a 9800 PRO can be had for what, $300?

From that point of view, video cards today have gotten cheaper than in the past.
 
Compare ATI's R420 (which should debut right around 18 months after the R300) and see if it doubles the performance in the highest-stress areas (which is the only way to measure how much power a GPU really has). I would be very surprised if it's not 100% faster. Compare the 8500 to the 9700 Pro (the 8500 came out around October 2000, the 9700 Pro was introduced in July 2002, roughly 20 months). The performance difference was way more than 100%, especially in stressful situations. Of course in Quake 3, where the GPU isn't stressed, you're not going to see 100% increases, but in situations where the game really relies on the GPU, you will. If you chart the increase from the 8500 at 4xAA and 16xAF to the 9700 Pro at 4xAA and 16xAF, you'll see that in 20 months the increase was closer to 300%, coming out to roughly 285% for 18 months. I think that far exceeds Moore's law. You can pick out any one time interval that doesn't meet expectations (waaah, 6 months after the 9700 Pro came out we didn't get a 35% performance increase), but if you look at it historically, it still holds up (and then some) and probably still will when the R420 hits (and then some).
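
One way to put a number on that, as a sketch: assume compound growth and rescale the gain to an 18-month window (the ~300% over ~20 months figure is the estimate from the post above; the 285% quoted there comes from rougher scaling):

```python
# Rescale an observed speedup over N months to an equivalent 18-month gain,
# assuming compound (exponential) growth. Input figures are the rough
# estimates from the post above: ~300% increase (4x) over ~20 months.

def equivalent_gain(months_target, gain, months_observed):
    ratio = 1 + gain                                   # 300% increase -> 4.0x speedup
    return ratio ** (months_target / months_observed) - 1

print(f"{equivalent_gain(18, 3.00, 20):.0%}")  # ~248% over 18 months, still well past a doubling
```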
 
The 9700 was a huge leap forward...

And my point: since then, we haven't seen much at all in the way of progress. Same basic core, pretty much, just pumped-up speeds.

"The 8500 came out around October 2000, the 9700 Pro was introduced in July 2002, roughly 20 months. The performance difference was way more than 100%, especially in stressful situations."

Where do you see that?? And to say 16x antialiasing gives a 300% increase in performance, therefore the card is 300% quicker, doesn't follow practical use. No one runs 16x antialiasing unless you can live with 30 fps in modern games. Even so, you're talking about ONE function that frankly produces very little, if not worse, graphics at a performance hit. The latest games that use DX9 features are what we are talking about, not 16x antialiasing. So what if it's threefold quicker at making the graphics blurry?
 
Ok, prices first of all:

TNT2 Ultra, I bought one the DAY they came out. Cost: $250.
Voodoo2 SLI. Cost: $400
GF3 Launch Cost: $400
GF3 street cost within ONE month, when I bought mine: $300.
GF4 Ti4600 Launch cost (IIRC): $420
9700 Pro Launch cost: $400

Prices have gone up, but they're not too crazy, given the amount of DDR ram/etc on the latest and greatest. DDR2 is expensive.
 
Originally posted by mscsniperx
No one runs 16x antialiasing unless you can live with 30 fps in modern games. Even so, you're talking about ONE function that frankly produces very little, if not worse, graphics at a performance hit... So what if it's threefold quicker at making the graphics blurry?

Do you even USE AA, or are you making a guess based on the technique used by modern cards for doing AA (super/multi-sampling)?

If you used it, you'd know that the human eye interprets the blurry edge as smoother than the jagged edge, since it is used to seeing things not in perfect focus. It deceives your eye into believing that it is smooth, and it works. Things are smoother. You blend the jagged edge together. The central portion of the image is untouched, only the edge. So it looks cleaner...

Oh, and the performance drop isn't nearly what you think it is. I don't know what program you're running, but I haven't had any problems running everything I want to at 4xAA, except for Far Cry. Everything else so far can be made to run there with a little tweaking and working. And AF is even better.

We're not trying to get a Moore's law exponential curve here any more. Instead, they are adding features to make things look more lifelike and appropriate, which is what AA and AF do.
 
Okay, here are my prices.

Asus GeForce 2 GTS Deluxe: $400

Asus GeForce 3 Ti 500 Pure: $450 new, but I bought it used for less

Asus GeForce 4 Ti 4600: $450 or something.

So prices haven't really gone up if you look at the amount of RAM on these video cards.
 
Moore's law isn't applicable to GPUs, but why do we need to double the power every 18 months or so? Nothing out there right now demands something greater than a 9700 Pro to merely run. When someone buys a 9800 XT over the 9700 or 9800 Pro, they're paying for a little more AA/AF and a little more resolution, if they can even run it without slowdown.
 
Originally posted by MemoryInAGarden
Moore's law isn't applicable to GPUs, but why do we need to double the power every 18 months or so? Nothing out there right now demands something greater than a 9700 Pro to merely run. When someone buys a 9800 XT over the 9700 or 9800 Pro, they're paying for a little more AA/AF and a little more resolution, if they can even run it without slowdown.

I wouldn't say no to a guaranteed 100% improvement in performance every 18 months or so, but it's too much to ask, really.

It would require video card makers to completely redesign their video cards constantly, and that would be far too expensive. Not to mention you can't fit those mega heatsinks you have on CPUs onto your fragile AGP cards without breaking them ;)

But as long as video cards aren't able to render totally photorealistic textures in real time, there is always room for improvement :)
 
In the original post you say 9800 XT compared to 9700 Pro, but then your 12% increase claim is 9800 Pro versus 9700 Pro, or at least that's what it says. From a 9700 Pro to a 9800 XT there is more than a 12% gain.
 
Originally posted by mscsniperx
I'm interested in knowing WHY GPUs fall so far short of the performance gains of CPUs, and for that matter of other PC components, which seem to follow or surpass Moore's law.

why must they adhere to Moore's law?

who says they have to?
 
Moore's law, pffft. I wasn't aware that nVidia, ATi, etc. had to follow this. Maybe you better call them up and let them know.
 
Well, here are the issues which concern me... comparing a CPU to a video card: video cards are more than simply a GPU, RAM prices do not always go down, etc. Moore's law (more an observation, and originally every two years) is not always a smooth curve, but often a stair-stepping pattern.

>>
9700 Pro which cost $400 in 2002
9800 Pro which cost $500 in 2004
<<

Another way to consider the issue is that for about $200 today one can purchase a 9700 Pro... I have not noticed many 9800 Pros over $300-400; also, it was released quite some time ago. The main issue is the slow memory bandwidth increases; this is likely due to memory makers focusing not only on speed but also on capacity.

With memory moving from 620 to 680 to 720, it is not surprising one would mention 12-13% gains.
 
Where do you see that?? And to say 16x antialiasing gives a 300% increase in performance, therefore the card is 300% quicker, doesn't follow practical use. No one runs 16x antialiasing unless you can live with 30 fps in modern games. Even so, you're talking about ONE function that frankly produces very little, if not worse, graphics at a performance hit. The latest games that use DX9 features are what we are talking about, not 16x antialiasing. So what if it's threefold quicker at making the graphics blurry?

It's pretty obvious you don't know what you're talking about: it's anisotropic filtering, which anyone with a 9700 or better card will use. 4xAA and 16x aniso is THE ONLY WAY to properly stress one of these new cards, as otherwise they are CPU-limited in the vast majority of games. The jump from the 8500 to the 9700 Pro was way more than 100% in STRESSFUL SITUATIONS WHERE THE CPU WASN'T A LIMITATION. The jump from the 9700 Pro to the R420 will likely also be at least 100% in STRESSFUL SITUATIONS, and as it is going to be right around 18 months, I don't see where your theory that GPUs are stagnating is coming from. There have ALWAYS been incremental increases between generations, and if you're upgrading from a 9700 Pro to a 9800 XT you're a moron anyway and deserve to be out the $500. The 9700 Pro to the R420 is a different ballgame, but if you look at what was available 18 months ago (Ti 4600), the 9800 XT is way more than a 100% increase. Take any recent card and look back at what was available 18 months before it, and it will be 100% faster or more. You also have to figure that what was available 18 months ago couldn't even do a lot of the things that today's cards do (PS and VS 2.0, etc.). Take a 9800 XT and the top card available 18 months ago in Far Cry or Halo and tell me the increase hasn't been a minimum of 100%. You're looking at it the wrong way: you can be pretty certain that 18 months from now the speed will have doubled, and the speed now has doubled since 18 months ago, but trying to narrow it down to less than that is not the point of Moore's law. You could say that since the 9800 XT was released there has been a 0% increase in GPU power, so the market has completely stagnated and Moore's law was a complete crock, but then you'd be totally missing the point (which you obviously are). The increase isn't split evenly over 18 months, i.e. you're not going to get a 6% speed increase per month, and it's pretty foolish to expect that or to blame Moore for your ridiculous expectations (or the vid card makers for not meeting them). Look at it in 18-month blocks and nothing less than that, and you'll see that Moore's law has been a pretty good predictor (even in areas it wasn't intended for).
 
Moore's law will eventually fail anyway. All exponentials must fail eventually; it's the way things work.
 
The low end has become cheaper and *FAR* more progressive than the high end. I remember drooling over a $200+ Voodoo 2; that was it. There was no alternative.

Then the Voodoo 3 rolled around in models 2000, 3000, and 3500. The retail price for the low end was $124 at Best Buy, if I remember correctly (I bought it the first week it hit the streets).

Fast forward a bit: the GeForce 2 MX's starting price? Around $100. Then the next big cheapie cards were the GF4 MX and Radeon 9000; this area is a little different because many more manufacturers entered the mix. Those all came in at $90-120 depending on configuration. The newest low-end cards? GeForce FX 5200, starting price $75-100; 9100, $80-ish; and the 9200 is below $75, cheap ($50+ at the start, I think).

See how the price of last-gen tech steadily keeps dropping? The low end is like a mirror image of the high end. This is where everyone makes the most money.

The most successful budget card I'd have to give to the GF2 MX; that card should be in the budget hall of fame or something.
 
Originally posted by Wixard
See how the price of last-gen tech steadily keeps dropping? The low end is like a mirror image of the high end. This is where everyone makes the most money.

But how much of a difference was there between the Voodoo3 2000 and the Voodoo3 3500? Now, how much of a difference is there between a Radeon 9000 and a Radeon 9800? An even better example would be a GeForce4 4200 and a GeForce4 MX. The budget cards are becoming worse and worse.
 
Originally posted by oqvist
Inflation is more than 1-2%. But if there is such a difference in performance between the 9600 and the 9800, why would they cost the same? ;)


Then why don't they just prevent their cards from overclocking? They could make a 9800 that was the same as the 9800 XT, only with a slower clock speed. When they came out with the 9800 Pro, what did they do? They canceled the 9700. Normally the 9700 would have become the mid-range card, but instead they canceled it. They came out with another POS and called it the 9600, and this time the 9600 is actually slower than the 9500. HAHA, I bet they are laughing all the way to the bank on that one.


Dude... if inflation were higher than 2%, then rich people would be freaking out and CNBC would be reporting on it. Also, the federal interest rate would be way higher than it is.
If you want to play around with some inflation numbers, check out http://www.westegg.com/inflation/
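
For what it's worth, compound inflation is easy to sanity-check yourself; a quick sketch (the 2.5% annual rate is a made-up placeholder, not a figure from this thread or from that calculator):

```python
# Adjust a past launch price into 2004 dollars with compound inflation.
# The 2.5%/yr rate is a placeholder; substitute real CPI figures
# (e.g. from the calculator linked above) for an actual answer.

def inflate(price, annual_rate, years):
    return price * (1 + annual_rate) ** years

print(f"$400 card from 2002 in 2004 dollars: ${inflate(400, 0.025, 2):.0f}")  # ~$420
print(f"$300 card from 1999 in 2004 dollars: ${inflate(300, 0.025, 5):.0f}")  # ~$339
```

At that placeholder rate, two years of inflation adds only about 5%, while an 8-year span compounds to roughly 20%.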
 
Originally posted by mscsniperx
ATI's R300 (9700 Pro) debuted in July 2002. The 9800 XT is currently the fastest card on the market as of February 2004, and real-world benchmarks show the 9800 Pro to be only roughly 12% faster than the 9700 Pro. Do the math... 12% / 19 months ≈ 7.6% per year performance gain for video cards. Moore's law = 100% / 18 months ≈ 67% per year.

so you're trying to extrapolate a curve from two points? :rolleyes: we've got a winner here folks...

consider this:

ati's radeon 8500 was released in october of 2001. in q3 arena, it got 39 frames per second at 1024x768, 4xaa, no af

http://www.anandtech.com/video/showdoc.html?i=1558&p=8

ati's radeon 9800 pro was released in march of 2003, about 17 months later. in q3 arena, it got 117 fps at 1600x1200, 4xaa, 8xaf

http://www.anandtech.com/video/showdoc.html?i=1794&p=9

even if you consider cpu scaling, it's a pretty massive increase in performance given that the resolution increased and af was turned on.
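
A rough way to read those two results together (a sketch that assumes fps scales with the pixels being pushed and ignores the extra cost of AF, so treat it as ballpark only):

```python
# Ballpark the throughput increase implied by the two Q3 results above:
#   8500:     39 fps at 1024x768,   4xAA
#   9800 Pro: 117 fps at 1600x1200, 4xAA, 8xAF
# Assumes fps scales with pixels pushed; the added AF work is ignored.

pixels_8500 = 1024 * 768
pixels_9800 = 1600 * 1200
fps_8500, fps_9800 = 39, 117

print(f"resolution ratio: {pixels_9800 / pixels_8500:.2f}x")                             # ~2.44x more pixels
print(f"fps ratio:        {fps_9800 / fps_8500:.2f}x")                                   # 3.00x
print(f"throughput ratio: {(fps_9800 * pixels_9800) / (fps_8500 * pixels_8500):.1f}x")   # ~7.3x in ~17 months
```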
 
Originally posted by TMCM
Then why don't they just prevent their cards from overclocking? They could make a 9800 that was the same as the 9800 XT, only with a slower clock speed. Dude... if inflation were higher than 2%, then rich people would be freaking out and CNBC would be reporting on it.

So video card makers shouldn't earn money? :rolleyes:

No, they can't disallow overclocking completely, since then they would lose a lot of customers.

They went with the 9600 since it's probably cheaper to produce and wasn't fast enough to compete with the high-end cards. They all do the same thing; it's up to you whether you want to buy it or not.

And believe me, over the 8-year span or so we are talking about here, inflation is more than 1-2%.

I also don't get why you think you should get more RAM for free.
 
Lots of random points for everyone:

A) Moore's law is more of a marketing plan than anything.

B) Video cards are limited predominantly by memory bandwidth, which has historically had trouble advancing as fast as CPUs do (thus long CPU pipelines, lots of CPU cache, lots of prediction, etc., all to either reduce memory access or take advantage of it efficiently, because it is SO slow).

C) Most games are CPU-bound, since developers tend to have fast machines, plus the CPU is the easiest thing that can predictably be taken advantage of in the future (i.e. CPUs are faster in the future), but you can't really take advantage of new graphics hardware properly aside from GPU speed and fillrate improvements.

D) I would say it takes 2-5 years for people to get experienced enough with PC graphics to really push the GPU efficiently (note that this is completely separate from keeping the game from being CPU-bound to begin with). The tools in the PC world suck for figuring out why your graphics are slow during development. It's usually the CPU being overworked, or too many calls to the API, but sometimes it's not, and it's hard to tell without good tools.

E) Rumor has it next-generation video cards will have around 50-60 GB/sec of bandwidth (about 3x more than we have now). We shall see. Large quantities of state-of-the-art RAM are expensive and are the main cost of the video cards...

F) AA is probably going to get brushed aside if most games end up using render-to-texture features, which I suspect they will as more programmers get experience. Those working on Xbox or who have been around since DX7 are seeing what it can do, and it's pretty cheap to get eye candy with this feature once you understand the possibilities. The problem is, with the way the hardware (and the APIs) are set up, you just can't have an anti-aliased texture that you render into. The 'screen' doesn't count, since you can't use it as a texture. Common effects require a copy of the screen as a texture (refractions and reflections, as well as various blurs and other interesting tricks); the refractive camouflage effect in Halo is a good example. AA settings have no effect on the creation of these textures (aside from a potential out-of-memory condition on low-memory video cards), which is why if you try to force AA on in a game doing these kinds of things, nothing seems to happen. The game is really being rendered into a screen-size texture and then copied to the screen, effectively bypassing the AA setting visually. The fact that the screen is AA'd doesn't mean anything, but you will be wasting fillrate and memory having it on at all in this case.
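
To put B) and E) into rough numbers, here is a back-of-the-envelope sketch; the resolution, overdraw, and per-pixel traffic figures below are illustrative assumptions, not measurements from any card:

```python
# Rough framebuffer + texture traffic estimate for one second of rendering.
# Every workload number below is an illustrative assumption.

width, height, fps = 1600, 1200, 60
color_bytes   = 4    # 32-bit color write per shaded pixel
depth_bytes   = 8    # approx. Z read + write per shaded pixel
texture_bytes = 16   # approx. texture data fetched per shaded pixel
overdraw = 3         # each screen pixel shaded ~3x per frame on average

shaded_pixels_per_sec = width * height * fps * overdraw
bytes_per_sec = shaded_pixels_per_sec * (color_bytes + depth_bytes + texture_bytes)

print(f"~{bytes_per_sec / 1e9:.1f} GB/s without AA")                   # ~9.7 GB/s
print(f"~{bytes_per_sec * 4 / 1e9:.1f} GB/s if every byte scaled 4x")  # ~38.7 GB/s, a worst case for 4xAA
```

Even allowing that real AA implementations don't multiply every access by four, numbers in this range are why the rumored 50-60 GB/sec matters more than raw GPU clock speed.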
 
Originally posted by Zoner
F) AA is probably going to get brushed aside if most games end up using render-to-texture features, which I suspect they will as more programmers get experience.

You got points :)

But about AA, I have heard they were able to fix it in 3DMark 2003, so maybe the hope for AA isn't completely gone. Even though I have noticed that AA works in fewer and fewer of the games out there that use DX9.
 
oh boy... this thread is a mess.... first of all Moore's law is not a LAW but a THEORY ... Moore isn't going to come and punish graphics card companies for not following his theory .... he based it on trends in the industry during his time ... times have changed ... i think people should really stop trying to apply this ancient theory to modern times ... don't you think graphics chipset companies are driven to have the fastest product so they can have a greater market share amongst enthusiasts ... if they could develop gpus any faster they would

however i do agree that its not worth it to upgrade as often anymore... there have been no huge breakthroughs since r300

and believe me ... i don't know too much about Moore's law but think that any person with common sense can see that this argument is moot
 