"The market is screaming for a solid third supplier....We are that company"

S3 is planning on taking a big stick to Nvidia and ATI with a card that has higher clocks than anything currently on the market, along with features such as SM4.0 and lower wattage per clock than any card Nvidia or ATI currently have.

And guess what: that card will have 8 pipelines, so it's not going to be going up against the top of the line.
 
furocious said:
About the mobos + integrated video: they don't manufacture the boards. Companies that do make motherboards don't want to add $100-xxx+ to the cost of a board. Who wants to buy a 400-dollar card when the midrange on-board performance will be useless in 6 months? Not me. You can't sell an integrated video device, either.
Yes, I am aware they don't manufacture the motherboards, just like they don't manufacture the video cards themselves. My point was that nVidia and ATI make the GPU chips, just as they make motherboard chipsets. And when they sell these chips in mass quantity to manufacturers, the cost comes down the more the manufacturers buy.

Keep in mind, the vast majority of computer-buying consumers will buy OEM pre-built PCs, or they turn to a PC-guru friend or family member to build them one for cheap. I only say this because my cousin ignored my advice and bought an OEM PC with integrated graphics. She called me up one day and asked why her computer wouldn't run The Sims 2. I told her that the problem was the onboard video was incapable of doing so. Luckily, her PC has an AGP slot, so she asked what to get. I made some recommendations from Newegg, but again she went to a Best Buy B&M and paid $150 for a Radeon 9200.

My cousin is not the only consumer out there who doesn't know anything about computers except power button, point, and click. There are millions of them. And millions of computer sales = cheaper parts when manufacturing. If she had to pay an extra $100-$200 over the price of the system because decent integrated graphics were there, then the extra initial cost is justified, since she went out and paid that much extra for an add-in card anyway.

I can understand your point, but a good integrated video chip will have a much longer useful life span than 6 months for a majority of OEM computer buyers.

The best video I have is a pair of 6600GTs in SLI. Hardly cutting edge, but they run every game I've thrown at them just fine. Hell, I'm running a GeForce4 Ti 4100 in one of my systems and it still does okay with what it's capable of doing.

My point: the 6600GT has been out for way longer than 6 months, and it still does great.

My other point: if there were decent mobile-class GPUs on select models of motherboards, they would have a lot longer lifespan than 6 months, because the vast majority of users aren't going to run BF2, Q4, D3, ES4, etc. at maximum uber-resolution with all the eye candy turned on, nor will they upgrade their video cards to the latest and greatest every launch like a very small percentage of enthusiasts do.

My other other point: I simply would like to see OEMs step up the integrated graphics on their desktops. And I'm keeping my fingers crossed that at some point down the road this will happen, be it from ATI, nVidia, or S3...
 
jebo_4jc said:
I tell you what. You take that attitude, and move to a communist country where prices are set by the government.
I will be here, hoping that a little competition will help bring prices of the rest of the graphics cards industry down, and will encourage further advancement in technology.
OK, so there are three huge graphics card suppliers in the US today: ATi, NVIDIA, and Intel. Where's the communism? I have plenty of choices, and when S3 was making cards they were a pretty bad choice compared to the competition. ATi, NVIDIA, and Intel aren't going anywhere, so they will always be an option. S3 can't push any cutting-edge technology because they don't have the resources. If they can offer cards with the same performance at a lower price than ATi and NVIDIA (and I know we're talking low-end stuff here), then they might be OK. The problem is they don't have the volume that the other companies have to keep prices low.
 
illgiveumorality said:
Your assumptions will kill you one day
Assumptions?
The last time I actively cared about S3 was back in 1996, and at the time 3dFX had better cards. In the past few years I've seen their OEM market share shrink because of better solutions from Intel, NVIDIA, and ATi. Unless they can totally turn their business around and make a very good product, then they're not going anywhere.
 
tdg said:
Wrong.
Wrong.
Since you're obviously a n00b, you probably shouldn't speculate on things such as this.
I didn't speculate anything.
As I said before, back in 1996 other companies had better cards.
Since then, in the OEM market, other companies have been offering all-in-one solutions with more features and better performance. Intel, NVIDIA, and ATi have taken almost all of the OEM market share S3 had. Unless S3 makes a complete turnaround, they will continue to be a very small player.

If they decide to make some mid- to low-end cards in the future, then I can only say "welcome to the world of incompatibility and driver issues" we've seen with XGI and other small companies.
 
GilmourD said:
I think that's close-minded and untrue. The Savage line had a place in the market and filled that place quite nicely. I sold quite a few Savage cards back then with nary a qualm nor complaint. They did what they advertised. I'm a sucker for nVidia products, and admittedly not a big fan of ATI, but I'm not a !!!!!!.

I hope S3 does well, not only for competition, but because they have indeed had a pretty decent product in the past. They're on the same path as ATI. Let's just hope they can pull it off.
The Savage cards were put to shame by Voodoo2 and TNT cards. There were also low-end versions of the TNT, like the Vanta, that filled in the lower price ranges and were still better than the Savage.

The only way they can 'pull it off' is if they get very good at making drivers. If they have cheap hardware and stable drivers, then they will get some sales...but if they have cheap hardware and crappy drivers, they'll tank.
 
"The market is screaming for a solid third supplier...."

Not really, I'm fine with just ATI and Nvidia :D
 
Killdozer said:
The only way they can 'pull it off' is if they get very good at making drivers. If they have cheap hardware and stable drivers, then they will get some sales...but if they have cheap hardware and crappy drivers, they'll tank.
QFT. When it comes to hardware and drivers being successful, it's pretty much a 50/50 split.

Good Hardware + Bad Drivers = Bad Reputation/Not Recommended

Bad Hardware + Good Drivers = Good for OEM and Ultra-Budget segment.

Good Hardware + Good Drivers = Win Win Situation

:D
 
S3 has always had lots of bizarre driver issues. I would love to see them come out with something interesting though.

I bought their infamous "Savage 2000" back in GF1 days, the one that had the hardware T&L disabled, and then when they enabled it, it slowed the card down! LOL

It also had an interesting tendency to not display squares of the desktop, so your screen would look like a checkerboard with black squares on it.

Neat sawtooth distortions between textures, too. The card was almost as fast as a GF1 SDR in some games, though, IIRC. I think it was as close as S3 ever came to making an actual "gamers" card.
 
Rollo said:
S3 has always had lots of bizarre driver issues. I would love to see them come out with something interesting though.

I bought their infamous "Savage 2000" back in GF1 days, the one that had the hardware T&L disabled, and then when they enabled it, it slowed the card down! LOL

It also had an interesting tendency to not display squares of the desktop, so your screen would look like a checkerboard with black squares on it.

Neat sawtooth distortions between textures, too. The card was almost as fast as a GF1 SDR in some games, though, IIRC. I think it was as close as S3 ever came to making an actual "gamers" card.
Wow I remember those days!

And now we rip graphics card companies because our card crashes EverQuest 1 out of every 1,000 sessions, or because in the right game, in the right situation, with the right amount of filtering and the right resolution, there is shimmering or what have you...
What a bunch of spoiled brats we've become.
 
OK, from what I have read at XBit Labs, the new S3 Chrome S27 doesn't seem like a bad card. Admittedly it is a budget card, but it seems to be at least on par with my Radeon 9800 Pro. It outperforms the ATI X1300 and the nVidia 6600 in most tests, and with better drivers it should beat them across the board. Not bad at all. I think there is hope for a competitor to the big two, though maybe not in the full-bore gaming card market.

A question for those in the know: the review mentions that the antialiasing for this card uses supersampling instead of multisampling. Is this a hardware feature or a driver feature? If it is a driver feature, can it be rewritten to use multisampling to improve performance?
 
Ultra Wide said:
I think their plan is also solid. They are not going for the #1 spot; they just want to be #3. That, IMO, is an achievable goal, and they can keep building from that success.

That's gonna be quite a feat, considering that spot is probably already Intel's (and they're #1 in overall market share, too).


EDIT: Whoops, didn't notice the date :p Anyway, how're they doing so far?
 
The point is that S3 is going to bring down the prices of the other competition. Everyone here agrees that they were good cards back in the day! I am only 17, and I remember when S3 was at the top!
 
Molingrad said:
I have heard of this company called 3Dlabs. They make video cards; never heard of anyone owning one, though.
3dlabs
Every couple of months people ask about Quadros, FireGLs, and Wildcats. Welcome. ;)

Those are CAD/DCC cards and won't help you in games.
 
phataj said:
The point is that S3 is going to bring down the prices of the other competition. Everyone here agrees that they were good cards back in the day! I am only 17, and I remember when S3 was at the top!
I don't agree they were good cards back in the day. S3 made a lot of promises and never delivered. Especially when they were tooting the T&L horn and it never worked.

Their integrated graphics were very nice and stable, but lacked any semblance of useful features at the early dawn of 3D graphics demand.

When they did bring out their "3D" accelerators, well, they just weren't worth a damn compared to what the competitors brought to the table.

Now that the S27 is out and available for purchase, it hasn't really affected the prices of the competitors' cards in the same performance category as much as many of us would have liked to see.

Only time will tell if they can take the crown in the similar performance category by releasing drivers that actually reflect what the cards are capable of performance-wise.
 
Admittedly, this is an ancient post, so it probably isn't even looked at anymore. However, there was a question that I posted, and I would like to get an answer to it if I could.

I asked:
A question for those in the know: the review mentions that the antialiasing for this card uses supersampling instead of multisampling. Is this a hardware feature or a driver feature? If it is a driver feature, can it be rewritten to use multisampling to improve performance?

Does anyone know the answer to this?
 
Prmetime said:
Does anyone know the answer to this?
Well, this is an old thread, huh?

My understanding is that hardware and software go hand in hand with multisampling/supersampling. If the hardware is capable of rotated-grid multisampling (which I'm betting it is), a driver release could enable it. However, if the hardware had the capability, I would imagine S3 would already have included driver support for it, as pure supersampling is, nine times out of ten, extremely inefficient and wasteful.
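
Just to illustrate why that's the case, here's a rough toy sketch in Python (nothing like S3's actual hardware or drivers; the shader and sample pattern are made up): 4x supersampling runs the full pixel shader once per sub-sample, while 4x multisampling shades once per pixel and just spreads that result across the covered samples, so on interior pixels it does a quarter of the shading work for basically the same output.

Code:
# Toy illustration only -- not any real GPU pipeline.
def shade(x, y):
    # Stand-in for an expensive per-pixel shader (just a smooth gradient here).
    return (x * 0.03 + y * 0.05) % 1.0

def sample_offsets(n=4):
    # Simple ordered grid of sub-pixel offsets inside the pixel (real hardware
    # usually uses a rotated or sparse grid, but the cost argument is the same).
    k = int(n ** 0.5)
    return [((i + 0.5) / k, (j + 0.5) / k) for i in range(k) for j in range(k)]

def supersample_pixel(x, y, n=4):
    # Supersampling: run the full shader once per sub-sample, then average.
    colors = [shade(x + dx, y + dy) for dx, dy in sample_offsets(n)]
    return sum(colors) / n, n                          # (color, shader calls)

def multisample_pixel(x, y, coverage):
    # Multisampling: shade ONCE at the pixel center, then replicate the result
    # into whichever sub-samples the triangle covers (coverage is a hit mask).
    color = shade(x + 0.5, y + 0.5)
    return color * sum(coverage) / len(coverage), 1    # (color, shader calls)

ss_color, ss_cost = supersample_pixel(10, 20)
ms_color, ms_cost = multisample_pixel(10, 20, coverage=[1, 1, 1, 1])
print(ss_color, "from", ss_cost, "shader calls")   # ~4x the shading work...
print(ms_color, "from", ms_cost, "shader call")    # ...for essentially the same color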

Concerning S3, I don't believe they'll ever gain good standing in this market. ATi and nVidia have a stranglehold on the market because A) their ultra-high-end products are the "reference point" for consumer confidence and B) they've been pushing very hard at R&D due to the direct and extremely tight competition. I imagine S3 really lacks the capability to design efficient architectures and, most importantly, the capability to write solid drivers.

I'd personally like to see Matrox back in the game, as, unlike S3, some of their past products have been extremely innovative (the Parhelia, for instance, really wrote the book on image quality back in the day). S3 did give us S3TC, however, which has managed to gain significant ground over the past few years.
 
Robstar said:
In this day & age, I _still_ can't figure out why desktop boards can't include integrated graphics that at least hit gef4ti4600 levels....

Rob
Imagine how much heat a decent graphics chipset would add to the motherboard. Also factor in the real estate used up by a sizable heatsink and space for RAM chips. Sure, you could compact a GF4 Ti 4600 into a lot smaller space nowadays and have it put out less heat, but that'd make it very expensive for what it is.
 
spine said:
Imagine how much heat a decent graphics chipset would add to the motherboard. Also factor in the real estate used up by a sizable heatsink and space for RAM chips. Sure, you could compact a GF4 Ti 4600 into a lot smaller space nowadays and have it put out less heat, but that'd make it very expensive for what it is.
I don't think real estate will be much of an issue... there are mobile variants of the nVidia 6000 and 7000 series GPUs now. Why not put those on desktop motherboards?
 
Innocence said:
S3 is very....retro

Haha, my thoughts exactly. They'll have to offer a really nice price/performance ratio in the midrange to pull people away from NV or ATI, and then something that's just ridiculously fast at the high end. I don't see myself buying an S3 card anytime soon, though.
 
DejaWiz said:
I don't think real estate will be much of an issue... there are mobile variants of the nVidia 6000 and 7000 series GPUs now. Why not put those on desktop motherboards?
Somebody did, and called it nvidia's 6100 and 6150 chipsets :)
 
spine said:
Imagine how much heat a decent graphics chipset would add to the motherboard. Also factor in the real estate used up by a sizable heatsink and space for RAM chips. Sure, you could compact a GF4 Ti 4600 into a lot smaller space nowadays and have it put out less heat, but that'd make it very expensive for what it is.

Principally, because desktop users fall into one of three camps:

- Business users, who couldn't care less about any 3D rendering power *whatsoever* beyond what is required for Vista. Ergo, they will not pay more for it.

- Gamers, who are always going to put in a 3rd-party video card, and will thus not pay more for better integrated graphics.

- Home users ('mom and pop'), who *might* play one or two games (Sims, anyone?), but 'cranking the settings down' is not something they have any issue with, mediocre performance is no problem, and they will only play those same one or two games for the next 6 years. As a result, they will not pay more for better integrated video, either.
 
jebo_4jc said:
Somebody did, and called it nvidia's 6100 and 6150 chipsets :)
But the 6100 and 6150 chipsets still lack the true gpu power of the mobile 7000 series.
 
Robstar said:
I have never even heard of that chipset and I have been building boxes forever. The nvidia stuff isn't out yet....nuff said :)

Rob

Then try looking around a bit more. LOADS of PCs have the Radeon Xpress chipset in them.
 
Demon_of_The_Fall said:
Then try looking around a bit more. LOADS of PCs have the Radeon Xpress chipset in them.
LOL, you just rebutted a comment made almost exactly one year ago.

Besides, I promptly smacked him down with a list of about 10 articles about the chipset :)
 
DejaWiz said:
But the 6100 and 6150 chipsets still lack the true gpu power of the mobile 7000 series.
You're right. But like dderidex said, nobody really wants high-powered integrated graphics when it boils down to it. The system we have today, where top-end technologies trickle down to the integrated market, works fine, really, considering most integrated gamers only want to play at 800x600 anyway.
 
S3 has been a third player for a while... I remember running Morrowind with an S3 card. It totally sucked; every time I walked into a building, the screen would turn purple or blue, so I'd have to reset the resolution to get it back. Quite annoying... I finally bought myself a nice 9600 Pro.
 
jebo_4jc said:
You're right. But like dderidex said, nobody really wants high-powered integrated graphics when it boils down to it. The system we have today, where top-end technologies trickle down to the integrated market, works fine, really, considering most integrated gamers only want to play at 800x600 anyway.
Is it that nobody wants powerful integrated graphics or nobody gets powerful integrated graphics? My point is, no one has a choice right now. For gaming purposes, current integrated graphics are sub-par at best in some titles and completely worthless in the rest. The story I told in an earlier post regarding my cousin is a prime example of this.

Most people (a majority of all computer buyers) don't get to choose whether or not they get powerful graphics capability, or even know how they could really benefit from it. And most consumer computer sales are run-of-the-mill, cookie-cutter models sold at BB, CC, CUSA (and the like), or through direct sales.

I think stating that most gamers with integrated graphics want to play at 800x600 anyway is a stretch. They don't have a choice. Joe or Mary Sixpack should be able to walk into a store, buy the new game they've been seeing ads for, and play it without a problem. My cousin couldn't even do that with The Sims 2. She paid $150 for a sub-par add-in card that barely does the job at 800x600 as it is. If she had the choice to pay $100 - $150 more for stronger integrated graphics at the time she bought the computer, it would have saved her lots of headache, confusion, and future expense.
 
DejaWiz said:
Is it that nobody wants powerful integrated graphics or nobody gets powerful integrated graphics?

For the same price? Of course everyone would take more power for the same price. In case you haven't noticed, though, there is a *touch* of a price delta (just a slight one, really) between a GeForce 6200 TurboCache and a GeForce 7950 GX2.

DejaWiz said:
I think stating that most gamers with integrated graphics want to play at 800x600 anyway is a stretch.

Maybe not - but my brother, who IS a 'standard' PC gamer (not hardcore, but certainly plays more than one or two games) didn't know the difference between 800x600 and 1024x768 until I pointed it out. Honestly, "Joe or Mary Sixpack" would probably never realize there was anything wrong with 800x600 at all unless you specifically pointed it out.

DejaWiz said:
Joe or Mary Sixpack should be able to walk into a store, buy the new game they've been seeing ads for, and play it without a problem. My cousin couldn't even do that with The Sims 2.

I agree only in part. Should they be able to buy a new computer, then be able to at least play any modern game out there with the integrated graphics? Sure.

Should they be able to buy a computer, then 5 years later go out and buy a new game, and expect it to be playable? Of course not.

Remember, it's not like Best Buy is shy about letting you know - 'This is a web surfing/MS Office system, and won't game well' vs 'This system is killer for gaming'.

DejaWiz said:
She paid $150 for a sub-par add-in card that barely does the job at 800x600 as it is. If she had the choice to pay $100 - $150 more for stronger integrated graphics at the time she bought the computer, it would have saved her lots of headache, confusion, and future expense.

BUT...if she knew, when she bought the computer, that she WANTED a gaming computer...why did she buy one that couldn't play games, and had no upgrade path? Likely, she bought the system not intending it to be for gaming at all, just interested in web surfing and Office stuff and the low price. And that's what she got.

You can't just turn an OEM non-gaming system into a gaming system - they are specifically designed to that end so you DO have to pay more to get more performance. That's why, when buying brick-and-mortar retail, you really need to understand what you will be doing with the system long-term and plan for that.
 
dderidex said:
For the same price? Of course everyone would take more power for the same price. In case you haven't noticed, though, there is a *touch* of a price delta (just a slight one, really) between a GeForce 6200 TurboCache and a GeForce 7950 GX2.
LOL, sarcasm noted. And no, not for the same price. As I stated earlier, if the consumer can get better integrated graphics for $100 - $150 more in the first place, then at least they have the choice to do so.



dderidex said:
Maybe not - but my brother, who IS a 'standard' PC gamer (not hardcore, but certainly plays more than one or two games) didn't know the difference between 800x600 and 1024x768 until I pointed it out. Honestly, "Joe or Mary Sixpack" would probably never realize there was anything wrong with 800x600 at all unless you specifically pointed it out.
I can agree with this.



dderidex said:
I agree only in part. Should they be able to buy a new computer, then be able to at least play any modern game out there with the integrated graphics? Sure.
Yep, we both agree on this.



dderidex said:
Should they be able to buy a computer, then 5 years later go out and buy a new game, and expect it to be playable? Of course not.
With my "gaming system", I know I won't even be able to play games 5 years from now. If not at all, then not at 1280x1024+ resolution with the eye candy turned up.



dderidex said:
Remember, it's not like Best Buy is shy about letting you know - 'This is a web surfing/MS Office system, and won't game well' vs 'This system is killer for gaming'.
Places like BB also aren't afraid to push a lackluster system because it's on sale, even if they've asked the customer what their needs are and the customer should be getting a more powerful system.



dderidex said:
BUT...if she knew, when she bought the computer, that she WANTED a gaming computer...why did she buy one that couldn't play games, and had no upgrade path? Likely, she bought the system not intending it to be for gaming at all, just interested in web surfing and Office stuff and the low price. And that's what she got.
She did buy it for web surfing and Office as well as gaming. She didn't know it wouldn't play games. She didn't know it had no upgrade path. These are the details often left out because the seller is looking for a repeat sale within the next year or two. Again, she got the system merely because it was on sale and she thought it was a good deal. I agree it's partly her fault for not researching on her own, but those questions should have been answered in detail and kept in mind by the sales rep when matching a system to her requirements.

I have smacked some sense into my cousin, so when it comes time for her next system, she now understands that she should consult me and have me build it for her.



dderidex said:
You can't just turn an OEM non-gaming system into a gaming system - they are specifically designed to that end so you DO have to pay more to get more performance. That's why, when buying brick-and-mortar retail, you really need to understand what you will be doing with the system long-term and plan for that.
I guess this is where differences in definitions come into play. I see systems as being categorized as Enthusiast, Gamer, Basic, and Budget. There is a very fine line between Gamer and Enthusiast, as we never know what games the devs will release to market and what the recommended system requirements will be. Our current Enthusiast PCs can quickly turn into Gamer or Basic systems in a matter of months, depending on what games and apps are released.
 