Where does it end? A rant about Cost and GFX!

Majin

Where does it End?
The advent of nVidia's SLI and ATI's (finally worth buying) Crossfire technology has been a great thing for computer enthusiasts and above all for gamers. It lets you upgrade your computer incrementally without having to spend huge amounts of money on the next best GFX card (which will just be surpassed in 3 months anyway). At the top end it allows people with some money to spend to have bleeding edge graphics and keep their expensive gaming rigs up to date. Overall a really great thing; but it's getting out of control, in my opinion.

3 PCI-e slots, 3 video cards… nVidia's way not only to increase their bottom line but to solve a problem of theirs: not being able to run dual or multi-monitor in SLI. And on top of all that, I am sure it makes all the other OEMs very happy. Why? 3 video cards means upgrades! More cards means more power is needed, which means a new PSU (how about 1200 watts?). More cards means more heat, which means more fans or water cooling. Not to mention that just to be able to use 3 cards you need to upgrade your motherboard to a 780i board, and since you have this new MB you might want to upgrade your RAM to PC 8500 to take advantage of it.

I'm not an idiot; I realize no one is forcing me to buy this stuff, and I know that if I feel like spending $8,000 I can get a really awesome computer from www.Maingear.com that will be able to play Crysis. But that's the problem! The mentality of these OEMs is: let's keep making it more and more expensive, let's keep putting out video cards with an MSRP of $1,000+, and someone will buy it. Why is that a problem? Because inflation works its way down from the top! Before long, a budget card that can barely run today's games will have an MSRP of $300 to start, and it will just go up from there.

Ten years ago, $300 would get you TOP of the line in video cards! Again, I am not an idiot. I know that times have changed, games have become way more graphically intense, and standards have improved; this creates the need for more advanced graphics cards, and as the cards get more advanced, the cost to make them goes up. But are we being taken advantage of? YES! There is zero reason that anyone should ever need 3 GFX cards… Instead, why not improve the cards to give us the same power we would get from 3 cards? HOLD THE FLAMES! I know they are doing this, I know you can get the new cards that are SLI on 1 card, and I am happy to see it. But they still came out with 3-slot PCI-e motherboards as well.

nVidia and ATI need to step back for a second and realize that the future of computer graphics is not "Let's create Quad SLI and the motherboards to run 4 cards." It's "Let's create a single card that will give the user the power of 4 GPUs. Also, let's make it stable, with an acceptable heat output, and above all let's sell it to consumers for a reasonable price!" The bang for the buck is just not there: for $1000 you can get a very good complete computer from Dell.com, or you can get one 8800 Ultra OC GFX card. Getting a great computer that will stay up to par for longer than 1 year shouldn't mean having to sell your car to buy it!

Take from this what you want: flame me, agree with me, or ignore me. This has all just been MPO.
 
Although I agree that some hardware reaches insane prices (high end GPUs and CPUs), you forgot to mention cards like the 8800 GTS 320, 8800 GT, 9600 GT, HD 3850 and HD 3870.

They were/are all excellent bang for your buck, since for about $170-$220 you get very powerful graphics cards. In the previous generation, you had the 7900 GS and X1950 Pro as the best bang for your buck as well.

Technology is always evolving. I'm not particularly fond of SLI or Crossfire (in either configuration), but it is a means to get to a situation in the future, where a single graphics card will be able to run as fast as the SLI or Crossfire setup. I prefer to wait for that day, but there are those that do not want to wait and they are the ones that NVIDIA and ATI are appealing to, with SLI and Crossfire.
 
I spent a bit under $2,500 and got everything for my new computer: a beautiful 24" 245BW LCD screen, a 780i board, two 8800 GTS 512s, 4 gigs of RAM, a copy of Vista, a copy of XP, a copy of MS Office, a bunch of games, HD fans, aftermarket coolers, case, etc. Probably $500 of that was just software and aftermarket cooling that I could have lived without. People who bought an 8800 GTX 18 months ago when it first came out will still have what is 90% of the fastest single GPU solution on the market until the 9800 GTX comes out. I went over the edge a bit on the price/performance curve, but I'm playing CoD4 at max settings and max AF and still getting 60+ FPS at 1920x1200. I haven't regretted it for a second.

Crysis was a game designed to look good after the next gen of hardware comes out. And that won't be until the Q3/Q4 generation of hardware.

As for the $8,000 PC: you can build a $25,000 computer at Apple, or a $28,000 one at Dell, or you could build a monster for $2K yourself.
 
We, consumers, have to constantly dump hundreds of dollars into a PC just to be able to play a game the way it should look/play, when a $350 console unit plays the same game just as well, if not better. The graphics manufacturers ARE ripping us off. There is no freakin' reason someone should have to put 2, 3, 4+ video cards in a damn computer to play a game just as well as a $350 console, other than the graphics manufacturers' profit goals. It's getting friggin' ridiculous. Where will it end? In 5 years are we going to see motherboards that are 3 feet long to accommodate the 10 graphics cards it requires to play COD5? LOL. At least the Xbox 360, which has been out for 2 years now, STILL plays every game that comes out for it just as well as the first game. And it'll do it at 1920x1080 without breaking a sweat or having to dump another $300-500 into it every year on new hardware.

Maybe consumers should stop fueling nVidia's and ATI's greed, and stop buying into their bullshit excuses and corporate marketing schemes. Stop believing that in order to play this new game you need to buy $1000 worth of video cards. Just stop buying their bullshit; maybe then they'll start making single products that actually perform on TODAY's games, not help you play yesterday's games the way they should have been played. Some will blame the game developers for not optimizing for today's hardware. Bogus. If that were the case, we WOULD need to upgrade our Xboxes.

I know bitching about it won't change a thing. It just sucks, is all, and I'm venting. LOL. Graphics corporations suck for the fact that they have control over how we interact with our equipment and software. It's like going to an auto dealer and them selling you half a car, and the salesman saying "Oh... you actually want to drive it??? LOL... you're going to need to buy a second car for that." That shit would never fly.

Just my $0.02
 
I have a hard time taking what you say at face value when you have two 8800GTX cards in SLI and two Raptors. The reason they do it... because you will pay for it. There really is no other reason, and that is the reason businesses are in business.
 
As long as people buy it, Nvidia and ATI will sell it. I doubt very seriously you could get the majority of PC enthusiasts to agree to stop buying SLI or Crossfire setups just to prod Nvidia or ATI to make single card solutions that are as fast as or faster than a multi-GPU setup.
 
PC gaming is dead to me, for the same reasons above. People can say what they want, but the fact is simple: you can break the bank and still not be able to play some games. Hello, Crysis. IMO it's not worth it, and developers and companies need to optimize their engines better. If it runs fine on a 7800 GTX (PS3 RSX) and a souped-up X1800XT (Xbox 360), then it should run comparably on our PCs, even at low resolutions. But it doesn't, and something is wrong. I'm personally more interested in multiple cards for display reasons and HD playback, but computer gaming, personally, I'm not surprised it's dying out. I don't mind them making SLI, etc., but it's not a great value.
 
Very interesting topic thus far.
I don't think we can blame nVidia or ATi. I think both of them know that they should be working on a GPU with the performance of four, not working on putting four GPUs together. But it's easier said than done, and this is business. It's a competition, and they're just out to make money. There will always be people who want the best of the best, hence extreme editions, etc., and the fact that people will buy these GPUs at inflated prices just so they can have the best means more $$$ for these businesses. Also, the fact that I can get an 8800GT and have it perform within a few percent of nVidia's flagship for nearly half the price makes the 8800GT that much more appealing.
 
$290 got me an 8800GTS 512, which is top of the line for my resolution. Total cost of my rig was just under $800 after Windows Vista. Seems like a pretty good deal to me. I remember spending $450 for my 9800 Pro 256MB. Seems like it's getting cheaper to me...
 
I have a hard time taking what you say at face value when you have two 8800GTX cards in SLI and two Raptors. The reason they do it... because you will pay for it. There really is no other reason, and that is the reason businesses are in business.

Oh, don't get me wrong, I'm a person who, if I had more money, would have a $20k computer and would upgrade it every year.

I made the choice to spend $1400 on 2 cards and, at the time, a $500 CPU and all the rest; I do these upgrades about every few years.
I got sick of having to replace my GFX card every 1.5 years just so I could play the next best game, so I spent the money and went SLI and I haven't looked back. It's awesome being able to play everything (minus Crysis) on max settings.

The main body of my rant is that 2 cards make sense: you buy 1 card and, as games get to be too much for that card, you buy another for less than buying a top end card. But creating a platform that uses 3 cards is just greed and an example of the consumer being taken advantage of!
 
Prices aren't getting worse, and this last gen was a relative bargain compared to past generations.

You also need to take into account monetary inflation since 10 years ago: assuming 2.5 to 3% inflation, $500 ten years ago is around $650 now. So prices are still pretty much in line with what they have been. Remember the GF2 Ultra? That was $500 when it came out back around 2000 or so.

The main difference now is you have the option to spend much more to get increasingly incremental gains. Which is nice if you have lots and lots of money laying around and want the best performance possible.
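As a quick sanity check on the figures above, compounding the price forward works out as follows. The 2.5-3% annual rate and the $500 starting price are the post's own assumptions, not official CPI data:

```python
def inflate(amount, rate, years):
    """Compound a price forward by a constant annual inflation rate."""
    return amount * (1 + rate) ** years

# $500 from ten years ago, at the post's assumed 2.5% and 3% rates:
low = inflate(500, 0.025, 10)   # ~$640
high = inflate(500, 0.03, 10)   # ~$672

print(f"${low:.0f} - ${high:.0f}")
```

The "around $650" figure quoted above sits right in the middle of that range.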
 
Although I agree that some hardware reaches insane prices (high end GPUs and CPUs), you forgot to mention cards like the 8800 GTS 320, 8800 GT, 9600 GT, HD 3850 and HD 3870.


I totally agree with you. It seems like this generation has some of the best bang for your buck cards in recent memory. There's always a huge leap in price from a high-end card or CPU to the top card/CPU. It's easy these days to build a damn good gaming PC for right around $1000. It won't beat any speed records but it'll play just about any game out there (outside of Crysis) well enough to not hinder gameplay experience.

I've had my current rig for almost a year now and it still handles everything I play regularly at 1600x1200 with max or close to max settings, besides Crysis. I don't see myself upgrading unless I get a new, higher res display or my 8800GTS becomes unable to keep up with the latest games and becomes a problem when trying to turn on eye-candy.
 
PC gaming is dead to me, for the same reasons above. People can say what they want, but the fact is simple: you can break the bank and still not be able to play some games. Hello, Crysis.

Crysis is the only game remotely like that - I'm playing COD4 with high settings right now with an x1800xt 256mb, 2gb RAM, and a 3800+ x2. In total, my system might have cost me $500 - and it can do all kinds of crap that consoles can't!
 
Crysis is the only game remotely like that - I'm playing COD4 with high settings right now with an x1800xt 256mb, 2gb RAM, and a 3800+ x2. In total, my system might have cost me $500 - and it can do all kinds of crap that consoles can't!

You have a similar rig to mine, except I spent over $2k at the time, because I bought during the high end period. Regardless, I won't be dropping that much money just to play games anymore; the difference isn't that great, plus old school gaming > any new stuff. I'll still drop a lot of money on graphics cards because of their display abilities, but if I used my previous mindset, I would have 3 8800 Ultras in Tri-SLI right now and be paying that off for the next 2 years. Honestly, it doesn't matter much to me anymore. My PS3 gets some love when I want to play, and that's rarely. The only game I can say I'm looking forward to on PC is StarCraft 2. Even shooters, Half-Life 2 being awesome, I'm done with, because I hate Steam.
 
10 years ago, $300 was the equivalent of $600 today, and the minimum wage was also much lower. Times change: either change with them or get left behind.
 
We, consumers, have to constantly dump hundreds of dollars into a PC just to be able to play a game the way it should look/play, when a $350 console unit plays the same game just as well, if not better. The graphics manufacturers ARE ripping us off. There is no freakin' reason someone should have to put 2, 3, 4+ video cards in a damn computer to play a game just as well as a $350 console, other than the graphics manufacturers' profit goals. It's getting friggin' ridiculous. Where will it end? In 5 years are we going to see motherboards that are 3 feet long to accommodate the 10 graphics cards it requires to play COD5? LOL. At least the Xbox 360, which has been out for 2 years now, STILL plays every game that comes out for it just as well as the first game. And it'll do it at 1920x1080 without breaking a sweat or having to dump another $300-500 into it every year on new hardware.

Maybe consumers should stop fueling nVidia's and ATI's greed, and stop buying into their bullshit excuses and corporate marketing schemes. Stop believing that in order to play this new game you need to buy $1000 worth of video cards. Just stop buying their bullshit; maybe then they'll start making single products that actually perform on TODAY's games, not help you play yesterday's games the way they should have been played. Some will blame the game developers for not optimizing for today's hardware. Bogus. If that were the case, we WOULD need to upgrade our Xboxes.

I know bitching about it won't change a thing. It just sucks, is all, and I'm venting. LOL. Graphics corporations suck for the fact that they have control over how we interact with our equipment and software. It's like going to an auto dealer and them selling you half a car, and the salesman saying "Oh... you actually want to drive it??? LOL... you're going to need to buy a second car for that." That shit would never fly.

Just my $0.02

You, sir, are simply wrong about a lot of this. First off, there's really no game out for both PC and console (so this doesn't count Crysis) that won't run just as well on a $1000 PC (all costs included) as on that $350 console. So yes, that PC costs a lot more, and it is also a hell of a lot more than a game machine.

PC gaming hardware is more expensive by a good bit, but the hardware is not subsidized like it is for consoles. Secondly, hardware vendors are not making big margins on console hardware. That's where exotic solutions like SLI and Crossfire allow the hardware makers to make some real cash.

I don't know why people see such a huge chasm between consoles and PCs. It's the same software developers and the same hardware makers simply targeting the various gaming markets out there.
 
I think where you stray in this situation is the idea that you are being somehow forced to participate in any of this.

If you wish to play a game on a 24" panel with high settings, then yes, you're going to have to buy an SLI setup to support that.

However, plenty of people get away playing on smaller screens and single card solutions.

As long as people are buying the technology at a given price it will continue to stay there or go up marginally over time.

If you want to effect change in this scenario, I suggest you get a whole mess of people to vote with their wallets.

Your console argument holds little weight as well. As stated, the console is subsidized. Also, consoles normally sell at decent losses in the beginning, because the company will make that up as production becomes cheaper and components become more readily available.

It's just free market economics at work, that's all. :)
 
Exactly. To keep costs in line, use a smaller monitor. It doesn't require the horsepower a larger monitor would require to run native resolution.
 
13 years ago, my 486DX rig with integrated graphics (?) cost $3800.

A few years after that, an 8MB Matrox card got framerates of 5 fps on the games back then.

Skipping advances in CPU technology, quite a few years after, an ATi Radeon 9700 got 20 fps on the games back then and cost ~$300. That system cost ~$2500.

A few years after, a 6800GT cost about $260 and got 30 fps on games back then. System = ~$1800.

The new rig I'm building has an 8800GTS 512 for ~$200 and is expected to run modern games, excluding Crysis, at ~60 fps. Current system cost, for an SFF, is $1100.

Meanwhile, console prices have increased quite dramatically, from the original Nintendo system I had back then to (IMO) outrageously expensive PS3s. Meanwhile we have annual inflation of 2-3%.

Where, exactly, has competition in the PC market diminished?
 
13 years ago, my 486DX rig with integrated graphics (?) cost $3800.

A few years after that, an 8MB Matrox card got framerates of 5 fps on the games back then.

Skipping advances in CPU technology, quite a few years after, an ATi Radeon 9700 got 20 fps on the games back then and cost ~$300. That system cost ~$2500.

A few years after, a 6800GT cost about $260 and got 30 fps on games back then. System = ~$1800.

The new rig I'm building has an 8800GTS 512 for ~$200 and is expected to run modern games, excluding Crysis, at ~60 fps. Current system cost, for an SFF, is $1100.

Meanwhile, console prices have increased quite dramatically, from the original Nintendo system I had back then to (IMO) outrageously expensive PS3s. Meanwhile we have annual inflation of 2-3%.

Where, exactly, has competition in the PC market diminished?

Just for some perspective:

1 GeForce 9800GX2 = ~1 teraflop > all the computers that ever existed from ENIAC in 1943 to 1991 combined.

That's trillions of dollars of R&D and brute computing horsepower you're paying for.
 
On that topic .. Cost of computational power: http://en.wikipedia.org/wiki/Teraflop

1961 - US$1,100 per FLOPS or $1.1E12 per GFLOP
2007, October - $0.20 per GFLOP (PS3)

There was an analogy that if automobiles had experienced the same evolution, they would today get 1 million mpg, weigh grams, and have power measured in the mega-horsepower range. Lol.
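Taking the two quoted data points at face value ($1.1e12 per GFLOP in 1961 vs. $0.20 per GFLOP in 2007, both figures as given above), the drop in cost per GFLOP works out to a factor in the trillions:

```python
# Both figures are the ones quoted in the post, not independently verified.
cost_1961 = 1.1e12   # dollars per GFLOP, 1961
cost_2007 = 0.20     # dollars per GFLOP, 2007 (PS3 estimate)

improvement = cost_1961 / cost_2007
print(f"{improvement:.1e}x cheaper")  # ~5.5e12, i.e. trillions of times
```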
 
I remember the good old days of incremental video card upgrades... you literally got nothing in the early development periods of 2D cards, and when the 3D add-on cards came from 3dfx, they honestly didn't give us much compared to what we are used to today.

When we started getting into the realm of 3D, we got access to a small amount of texture filtering on textures that really didn't get much use out of it, if any at all, blurry AA, and a one-up on the resolution per increment. Hell, I had to go to SLI'd 12MB Voodoo2s just to get Quake 2 to run 1024x768 just "ok" with Glide on.

Today, with hardware not exactly skyrocketing in price, and looking at the differences between the 7950GX2 and the 8800GTX, I'd say we are definitely getting more for our money than back then.

Don't forget the very VERY minimal increases we were getting before the GF4 was released. Compared to then, we are getting huge bargains now, so be glad for the great progress they have been making, giving us products at every price range we could ever want!
 
The OP isn't arguing that computers haven't gotten better or cheaper, so all these replies mean nothing. He's saying it's getting ridiculous the amount of money we have to spend to follow the trend, based on what companies want you to pay to stay on the high end. Sure, there are good bang-for-buck options, but that still doesn't justify tri-SLI. Obviously the OP wants and wishes to use tri-SLI, just like me, but the benefits just aren't there. And for all these bang-for-buck arguments: consoles > PCs, any day, in bang for buck. Computers > consoles, because they are cooler.
 
If we take a median sample of, say, all games released within a year, I would expect that the sample would perform better, in absolute fps, at a cheaper hardware price than a similar sample from, say, 5 years ago. Perhaps, then, our standards have changed since the days of the Voodoo 2? That said, what is "affordable" is an intensely personal decision.
 
I don't think your argument is very valid. nVidia and ATI continue to push these boundaries because people buy it. *BUT* developers know that this is far from common and don't target those audiences specifically, so $300 can get you an 8800GTS 512MB which, let's face it, is basically top of the line. Sure, you can pair multiples of them, but it is still top of the line in single card form and will dominate all the games out there (except Crysis, which nothing dominates). nVidia and ATI will stop with these crazy multi-GPU setups as soon as they stop being profitable (aka, not any time soon). OEMs ripping off consumers is nothing new. Take Apple, for example, where apparently 2x2GB of DDR2 667 costs $500 (despite Newegg selling it for what, $50?).

And to those people who think consoles are cheaper, I beg to differ. Think about it: you spend $400-500 on a console, but you are still going to have a $600-700 computer (minimum) too. The $400-500 that you would have spent on a console could be added to the computer budget, giving you $1000-1200 to play with - easily enough for a C2D + 8800GT, which will tear up most games out there at max quality and look better than what consoles can push now, much less in 2 years' time (roughly half the console's lifespan) when the console is dated. Of course, PC games are on average $10-20 cheaper, so there are long-term savings there too.
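The budget argument above reduces to simple arithmetic. As a sketch, all dollar figures below are the post's rough estimates (console price, baseline PC cost, per-game discount), not market data:

```python
# Everyone needs a basic PC anyway, so the console's price can instead
# top up the PC budget. Ranges are (low, high) estimates from the post.
console = (400, 500)        # console price range
base_pc = (600, 700)        # PC you'd own regardless
game_discount = (10, 20)    # PC games cheaper per title

# Redirect the console money into the PC budget:
gaming_pc_budget = (console[0] + base_pc[0], console[1] + base_pc[1])
print(gaming_pc_budget)     # (1000, 1200)

# Long-term software savings over, say, 20 games:
savings = (20 * game_discount[0], 20 * game_discount[1])
print(savings)              # (200, 400)
```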
 
The OP isn't arguing that computers haven't gotten better or cheaper, so all these replies mean nothing. He's saying it's getting ridiculous the amount of money we have to spend to follow the trend, based on what companies want you to pay to stay on the high end.

So, to the OP, either get a job, or I don't know, get a library card (those are free!) and do some reading.
 
Exactly - a PC can replace a console .. but not the other way around. And the "average" developer will of course target the largest audience, and in that respect the hardware costs for the largest audience have been on an unstoppable decline.
 
I loved the days when I could buy a low-end/mid-range card, then overclock/unlock it into a $300-$400 card. :p

All of these were release prices:

GeForce Ti 4200 O/C'd to Ti 4600 (forgot the price, I think ~$150)
FX 5200 O/C'd to FX 5600 Ultra ($70, lulz)
Unlock 9500n/p to 9700 Pro (sub-$150, I think)
Unlock 6200 to 6600GT or better (I got mine up to 600/900 with the help of potentiometers :eek:. Plus I got the card for ~$30 to boot!)

Damn I miss those days.

BTW, you people can't compare the price of consoles to PCs, because console makers take significant losses, while graphics card makers are in it to make money. Well... except for the Wii (GC 1.5). The people buying that thing are the ones really getting ripped off. It's basically an overclocked GameCube. :eek: They probably make $210 on each console sold. I still don't understand why they couldn't just add an add-on to the sub-$80 GameCube and be done with it, since that's basically what the Wii is.

BTW the 8600GTS/2600XT pissed me off. The mid-range was rockin' before they ruined it with those monstrosities.
 
We, consumers, have to constantly dump hundreds of dollars into a PC just to be able to play a game the way it should look/play, when a $350 console unit plays the same game just as well, if not better. The graphics manufacturers ARE ripping us off. There is no freakin' reason someone should have to put 2, 3, 4+ video cards in a damn computer to play a game just as well as a $350 console, other than the graphics manufacturers' profit goals. It's getting friggin' ridiculous. Where will it end? In 5 years are we going to see motherboards that are 3 feet long to accommodate the 10 graphics cards it requires to play COD5? LOL. At least the Xbox 360, which has been out for 2 years now, STILL plays every game that comes out for it just as well as the first game. And it'll do it at 1920x1080 without breaking a sweat or having to dump another $300-500 into it every year on new hardware.

NO! Consoles do not play games as well as a high end PC, or even close. For one, to say consoles "do it at 1920x1080 without breaking a sweat" is blatantly false. Very few console games actually run at that resolution natively. Sure, the console outputs that to your TV, but the actual game is running at a much lower resolution and just gets upscaled. Most 360 games run at 1280x720, and some are even lower than that. Games like Project Gotham Racing 3 and Perfect Dark Zero ran with 540 lines of resolution.

Consoles can get away with the lower resolutions because you're sitting 6-10 feet away from the TV. However, it doesn't change the fact that PCs offer a much more detailed, higher resolution image.

To put things in perspective: my Xbox 360 was $400, and it ran Oblivion at a true resolution of 1280x720 (running it at 1080i just upscales that lower resolution image). It had a fair amount of graphical features turned on in the way of lighting and HDR, but a lot of the settings were equivalent to medium-low on the PC. It did all of this at 30 FPS tops, with frequent slowdowns outdoors.

Later that summer, when the C2Ds came out, I built a system with an E6400 and a 512MB X1900XT for around $1000. That system handled the game at pretty much max settings, with some settings pushed beyond max through the ini file: HDR, 4x AA, 1680x1050 res. Much, much better draw distance than the 360, and the content also looked a lot better because the PC version could accommodate ultra high resolution user-made textures.

So less than a year after the 360 came out, a $1000 upper-mid range system absolutely destroyed it. Now I've upgraded my PC to SLI'd G92 8800 GTS cards. This ended up costing me less than $400 after factoring in the sale of my old mobo/gfx card. My system now has roughly 3-4x the raw graphical power of a 360.

Now, if you're happy playing low resolution PC ports that can't be modded and often have settings comparable to "medium" on the PC versions, that is fine. A console is good at doing that. And again, if you don't care about gaming at 60+ FPS and are fine with having your games locked at 30 FPS in many cases, again consoles have you covered. For me though, I'm busy and my gaming time is precious, so I want the best possible experience when I game and I don't mind paying for it.

If I was fine with playing games with gimped settings and substandard resolution/framerates like many 360 games, I could have kept my X1900XT and it would still handle today's games better than the 360. The only games that are even arguably better on consoles are the crap UBI games like R6Vegas that were developed for the console and then ported to the PC.
 
Howdy!

It seems that some here have missed a minor point. A console has a fixed configuration compared to a computer, and a far simpler OS. When a game is made for a console, the programmers stick to what they have to work with and create it to fit. There's no point in exceeding the limits of a console, because there will be no hardware upgrades, no tweaks added to it. Plus, who would want a console game with framerates like a slide show?

Presently there are two major CPU suppliers, two major GPU suppliers, and several chipset suppliers to take into consideration, all with their own 'standards' of gaming tweaks embedded. Newer GPUs, unlike older ones, support part or all of DirectX 10 functionality.

Programmers of computer games face a different problem than console programmers. Take, for instance, DirectX 10 being released only on Vista: how many people have upgraded to Vista yet? They want to use ever increasing available resolutions, multi-level animated backgrounds, and visual quirks to create an increasingly good looking game for those willing to pay, and at the same time be able to support more common graphics and computing power.

Any more questions?

Your's, (-:b
 
Also, "inflation works its way down from the top" isn't 100% accurate. Inflation is a constant: 1 dollar a year ago will get you 98 cents' worth this year, and 100 dollars a year ago will get you 98 dollars' worth this year. Rich people are good for the economy. Think of the real estate investor (current economic situation aside).

Let's say downtown New York is in need of a new office building. A real estate investor (like Donald Trump) decides to buy a block of housing/older buildings/etc., tear them down, and build a new skyscraper. Just to throw numbers out there, he pays 20 million for the original property and 100 million for the new building. That first 20 million dollars goes to the original property owners (who probably paid much less for their property), while the 100 million pays for the materials and labor to build the new building. By meeting the demand for more office space downtown, Trump puts a considerable chunk of his money back into the economy. His fancy new building is then rented out, and he gets a return on his investment.

How this applies to graphics cards is pretty simple. Nvidia spends hundreds of millions (wasn't the end total for G80 $400 million? I can't cite a source for that one, though) to design and produce their latest and greatest graphics card. The street price of that card was $650, which means companies like BFG are selling it to retailers for $550-600. Nvidia as a company doesn't produce their own boards; they sell the silicon and reference designs to companies like BFG. BFG needs to purchase the RAM, not to mention cover the cost of manufacturing the boards.

If BFG is selling them to retailers for $450-500, I bet their total cost is around $350. My guess is that of the total cost of the board, the GPU itself is at most half, meaning $175. So, using numbers completely pulled from my ass, let's say that Nvidia makes $175 for every G80 GPU they sell. If development did cost $400 million, that means they would need to sell almost 2.3 million GPUs at $175 just to break even.

The "trickle down" here is when a G80 core doesn't quite get manufactured correctly. And, like Donald Trump pays other people's wages while he's making money for himself, my point here is that the high-end graphics cards are what make the mid-level so cheap. Without that cluster of 32 pipelines disabled on the G80, we never would have had the original 8800GTSes, which were considerably more cost-effective. The high end will always help bear the cost for the mid-level. Remember that.
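The break-even arithmetic above can be checked directly. Both inputs are the post's own admittedly made-up numbers (an uncited ~$400M development cost and a guessed $175 margin per GPU):

```python
import math

dev_cost = 400e6          # rumored G80 R&D cost (uncited in the post)
margin_per_gpu = 175.0    # guessed NVIDIA revenue per G80 die

# Units that must be sold before development cost is recovered:
break_even_units = math.ceil(dev_cost / margin_per_gpu)
print(f"{break_even_units:,} GPUs")  # ~2.3 million
```

Which matches the "almost 2.3 million GPUs" figure in the post.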
 
I think where you stray in this situation is the idea that you are being somehow forced to participate in any of this.

If you wish to play a game on a 24" panel with high settings, then yes, you're going to have to buy an SLI setup to support that.

However, plenty of people get away playing on smaller screens and single card solutions.

As long as people are buying the technology at a given price it will continue to stay there or go up marginally over time.

If you want to effect change in this scenario, I suggest you get a whole mess of people to vote with their wallets.

Your console argument holds little weight as well. As stated, the console is subsidized. Also, consoles normally sell at decent losses in the beginning, because the company will make that up as production becomes cheaper and components become more readily available.

It's just free market economics at work, that's all. :)

QFT.
 
I disagree that consoles equal the quality of a $2,000 PC. An economically built PC could easily match the PS3, and the Wii doesn't even come close. I haven't played the Xbox 360, but it doesn't seem to be a whole lot better than my brother's two year old PC, which would probably have a resale value of $400-500. Also, PCs can be upgraded.
 