Rumors of ARC's cancellation...

The G200 is alive and kicking in thousands of servers purchasable today. Matrox is doing OK.

Yeah that's why I thought of them. They have kept their toe in the water all these years. I bet they could still come up with something new. Whatever they did, it would be interesting.
 
I'll file it right next to Larrabee.
Yeah, I was going to say, this isn't the first time they canned a card/GPU early.

staknhalo, to be fair, AMD actually wasn't totally wrong on Volta, if you consider that Nvidia never released a consumer Volta product line.
 
They have to - and I'd strongly suspect that Gelsinger believes so as well. Fabs were always one of Intel's strong points - just because they're behind now doesn't mean they won't be on top again in the future. Especially as Intel has money to invest to improve there - or buy licenses for whatever tech they might need. ARC being fabbed outside Intel is definitely a slight on it from many perspectives, and I'd see that being something that rankles for Gelsinger as well - it's not a core competency then.

Eh... he was an engineer there for years before moving into management. He understands the business - from a hands in the dirt level - and there's no question about Gelsinger being a visionary, at least from my perspective. As for taking it over - that's what you have a good COO or EVP for. And he's always done well picking those.

We'll have to agree to disagree there - Gelsinger had no enterprise software experience (really) when he took over VMware, but he understood enterprise business and the idea of focusing on your core concentration, and brought them to whole new levels. Replacing Maritz was one of the best things that ever happened there - and the executive management he selected to run the business groups made them billions. He might not know fabs, but he knows engineering, he knows CPUs, he knows why they NEED a fab, and he knows people to bring in and make the necessary changes. The CEO isn't supposed to be making decisions about how to build a fab - just whether they need to be in that business (or the GPU business) or not.

I had to go back and do a little more reading before responding. You had almost convinced me to see things your way.

However, the Amelio is strong with this one.

Intel screwed up 10 and 7nm; they are now fabbing outside of Intel, while opening the fabs they do have to outsiders, while trying to rebuild a fab business in which they are terribly behind, while trying to build a GPU business during a GPU glut with prices falling and crappy performance of their product....

Not to mention the killing of Optane.

Can we define what business this company is in? Because honestly I have no idea....

Right now Intel reminds me of Apple in the late 90s- I lived through that as a systems engineer. And Intel, like Apple in 1997, is simply pointed in too many directions. It's not sustainable.

Gelsinger could be the embodiment of the returned Christ - though he still has to live by the laws of the universe. And those laws imply a high probability that a business divided against itself cannot stand.

(Sorry for the biblical reference stuff... I'm feeling wistful today)

So where's the beef here? All I see is a company flailing away and not producing much at all. They really aren't. At the rate they are going in 3 years you'll need a payloader to back that new 10nm Intel chip up to your house and a crew of ten just to drop it into your motherboard (which will be submerged in your backyard pool for "cooling purposes").

Looks like Intel just announced a chip with an "Extreme 350 Watt Performance Mode". No, I'm not kidding. Is this indicative of a company with its #%*# together?

Yes, I'm being obtuse (the 350-watt thing is true, however). My point being that the vortex is in motion, but we haven't heard the gurgle yet.
 
So where's the beef here? All I see is a company flailing away and not producing much at all. They really aren't. At the rate they are going in 3 years you'll need a payloader to back that new 10nm Intel chip up to your house and a crew of ten just to drop it into your motherboard (which will be submerged in your backyard pool for "cooling purposes").

Looks like Intel just announced a chip with an "Extreme 350 Watt Performance Mode". No, I'm not kidding. Is this indicative of a company with its #%*# together?

Yes, I'm being obtuse (the 350-watt thing is true, however). My point being that the vortex is in motion, but we haven't heard the gurgle yet.
Going to need bigger cases to fit the new 540mm rads for the $350 AIO.
 
I don't even think a 540mm rad would cut it if I were cooling a 700W AMD GPU.

 
This is the first generation where I've intended to upgrade, but am holding off because I seriously want to hear about people's power bills before I buy.
 
I'm conflicted. I want to laugh at Intel and Raja and that snake-oil salesman Ryan, but also cry for having to put up with Nvidia and AMD indefinitely.
 
They clearly weren't really interested in competing with high performance GPUs. This entire exercise seems more like it was about making better APUs.
 
I had to go back and do a little more reading before responding. You had almost convinced me to see things your way.

However, the Amelio is strong with this one.
Pat Gelsinger is definitely not Gil Amelio - aside from having had a much more varied career prior to taking the helm of Intel, he's also been much more successful - plus, say what you will, Gil bought NeXT - which ended up saving Apple in the end (and Jobs also screwed him with that stock dump - brilliant move). Gelsinger comes from a later period in the industry (he's 20 years younger), and has already done this job once (Apple was Gil's first time at CEO - he'd only been president before, while Gelsinger has already been CEO once, was heir-apparent to Michael Dell, and ran other independent groups inside of EMC prior to being made CEO too).
Intel screwed up 10 and 7nm; they are now fabbing outside of Intel, while opening the fabs they do have to outsiders, while trying to rebuild a fab business in which they are terribly behind, while trying to build a GPU business during a GPU glut with prices falling and crappy performance of their product....
Oh agreed, they're in a shitty spot now - you can't turn a company around fast, but part of it is shedding things that aren't core business.
Not to mention the killing of Optane.
This is partially because it wasn't ever the success at the scale they thought it would be - and it's also being replaced by something "better" (supposedly CXL will blow it out of the water, but we'll have to see once it hits the actual ground - no one knows till then). Optane was somewhat of a stop-gap, even in the DC market (see also: Liqid, BULL computers, etc).
Can we define what business this company is in? Because honestly I have no idea....
They're a semiconductor and software company.
Right now Intel reminds me of Apple in the late 90s- I lived through that as a systems engineer. And Intel, like Apple in 1997, is simply pointed in too many directions. It's not sustainable.
Absolutely - and that's what Gelsinger is historically good at fixing, by cutting a lot of those directions off at the knees.
Gelsinger could be the embodiment of the returned Christ - though he still has to live by the laws of the universe. And those laws imply a high probability that a business divided against itself cannot stand.
See prior statement :) VMware owned Zimbra - an email server company - and SlideRocket (slides!) when he started. Talk about divided.
(Sorry for the biblical reference stuff... I'm feeling wistful today)

So where's the beef here? All I see is a company flailing away and not producing much at all. They really aren't. At the rate they are going in 3 years you'll need a payloader to back that new 10nm Intel chip up to your house and a crew of ten just to drop it into your motherboard (which will be submerged in your backyard pool for "cooling purposes").
Agreed entirely. The difference here is I've watched Gelsinger chop off a lot of crap to fix a company like that before, and I suspect he'll do it again (from what I'm seeing already).
Looks like Intel just announced a chip with an "Extreme 350 Watt Performance Mode". No, I'm not kidding. Is this indicative of a company with its #%*# together?
He's been CEO for what, 18 months? It takes 2-3 years to tape out silicon, longer still to get it to market. You can't change physics. And you can't fix that without simply ditching ALL your products. And that gets the shareholders to riot. Have to generate revenue till then, no matter what. Software you can kill fast - silly product lines you can kill fast. If Intel stopped making CPUs for 3 years, they'd cease to exist and Gelsinger would be held up as an example of "the fastest way to commit career suicide ever."
Yes, I'm being obtuse (the 350-watt thing is true, however). My point being that the vortex is in motion, but we haven't heard the gurgle yet.
Oh agreed - but I have faith in good leaders, and Gelsinger is one of the best I've ever had the pleasure of working for/with.
 
I'm conflicted. I want to laugh at Intel and Raja and that snake-oil salesman Ryan, but also cry for having to put up with Nvidia and AMD indefinitely.

I feel the same, strictly from a consumer perspective, even if I personally prefer Nvidia there, just to have a 3rd party I'd probably never buy anyway to help try to keep prices in check.

But it's hard to mourn the loss of something you never truly had anyway.

So the answer is laugh at Raja.
 
It's a good decision - it would have taken them a minimum of two generations to catch up with NVIDIA's outstanding memory compression, let alone AMD's massive cache doubling performance in a single gen! Then the chronic driver and firmware bugs will never have to be solved!

I mean, it's not like there will ever be another Ethereum rush after the final switch in a week; Intel can continue to make money off their dedicated Bitcoin ASICs, and ditch discrete for the third time in their history.
 
I can't help but think Intel can recoup a lot of their investments by making capable APUs for the entry-level and mid-range markets. They're going to need to compete with AMD who is going to have all-in-one parts no matter what. They almost can't afford to give up graphics, even if it means doing away with discrete.

It's almost like they should have been doing this the whole time...
 
This is such a huge mistake and completely short-sighted... How could they have thought their first generation of cards would be any kind of success? They needed another 2 generations to become competitive. The idea that they'd launch this endeavor at all without that kind of investment is embarrassing.
 
I can't help but think Intel can recoup a lot of their investments by making capable APUs for the entry-level and mid-range markets. They're going to need to compete with AMD who is going to have all-in-one parts no matter what. They almost can't afford to give up graphics, even if it means doing away with discrete.

It's almost like they should have been doing this the whole time...
That's my feeling as well. It seems like too much of a risk to have AMD, NVIDIA, Amazon, Alibaba Cloud, Apple, Huawei, Qualcomm, etc. out there making complete compute solutions for GPU-like workloads while you make none. If they can spin off a discrete GPU product out of that work, good, but it is maybe not necessary (and it carries the headache of decades of game drivers as well) versus data/AI/crypto/etc. center solutions.
 
This is such a huge mistake and completely short-sighted... How could they have thought their first generation of cards would be any kind of success? They needed another 2 generations to become competitive. The idea that they'd launch this endeavor at all without that kind of investment is embarrassing.

When you're recreating the complexity of a modern high-end supercomputer in a slot, it becomes nearly impossible to make any money when you already have two well-established competitors you have to displace from the market - it's the same reason most broadband companies won't invest in an already crowded market.

Oh, and reason 2: if Ethereum going proof-of-stake permanently kills GPU mining centers, we will be returning to the days of falling discrete card sales; it's a lot easier to make integrated graphics pay off.

[Graph: JPR Q2 2016 historical annual GPU sales]
 
I feel the same, strictly from a consumer perspective, even if I personally prefer Nvidia there, just to have a 3rd party I'd probably never buy anyway to help try to keep prices in check.

The problem is, if too many people think "someone else can buy Intel so muh Nvidia is cheaper", then there wouldn't be enough people buying Intel to keep prices in check.
 
This is such a huge mistake and completely short-sighted... How could they have thought their first generation of cards would be any kind of success? They needed another 2 generations to become competitive. The idea that they'd launch this endeavor at all without that kind of investment is embarrassing.
I mean, if the price was right, a scaled-up iGPU with RAM in a PCIe slot would be saleable. There's a market of people looking for something to accelerate video encode/decode. It would cut into their processor market, but there are people using several-generations-old desktop processors with iGPUs that need a little more GPU, and who would rather update that than get a new motherboard, a new processor, and new RAM. Again, it doesn't help their processor market, but I'd have bought one for the right price for a budget AM4 home server, although the video out in every AM5 processor will solve my needs there.

OTOH, the drivers seem like a mess, which AFAIK isn't the case for the iGPUs, and other Intel products have drivers with working installers, so I just don't know wtf was going on there. And the reported performance didn't seem to justify the price either?

If it were me, I'd have built from the bottom of the market - with less fiddly drivers, the A380 could be a value at $100, and release something less capable for $50 to fit the niches of people who need something but not much. Next gen, try to make a good value at $200, but also keep a model at $50 and $100. Keep working on it, and in a few generations (faster if someone else stumbles) you can throw together something high end. But nobody lets me run their company ;)
 
I mean, if the price was right, a scaled-up iGPU with RAM in a PCIe slot would be saleable. There's a market of people looking for something to accelerate video encode/decode. It would cut into their processor market, but there are people using several-generations-old desktop processors with iGPUs that need a little more GPU, and who would rather update that than get a new motherboard, a new processor, and new RAM. Again, it doesn't help their processor market, but I'd have bought one for the right price for a budget AM4 home server, although the video out in every AM5 processor will solve my needs there. OTOH, the drivers seem like a mess, which AFAIK isn't the case for the iGPUs, and other Intel products have drivers with working installers, so I just don't know wtf was going on there. ;)

When you have to transition from easy 128-bit DDR buses to much higher-latency 256-bit GDDR6, it adds a lot of complexity.

And even though AMD knew what they were doing with the second-gen GDDR5 bus on GCN, they spent so much effort getting the compute shaders with asynchronous compute working that they had no time to fix the new memory controller's massive latency before launch!
 
If the rumors of cancellation are true, this should bum everyone out. Not that the first gen of these cards is any good. But if any company could, over time, make a good discrete card, it's Intel. They have tons of money and could get the talent required.

Cancelling now leaves us as we were before. A duopoly.
 
If the rumors of cancellation are true, this should bum everyone out. Not that the first gen of these cards is any good. But if any company could, over time, make a good discrete card, it's Intel. They have tons of money and could get the talent required.

Cancelling now leaves us as we were before. A duopoly.
The reason we have a duopoly is because discrete card sales have been falling since 2007 (see my previous post for graph)

Assuming that the temporary bump from Ethereum is over, you can't make money unless you have enough enterprise hardware to raise margins.
 
The reason we have a duopoly is because discrete card sales have been falling since 2007
Isn't that somewhat a "natural" place to be? Coke vs Pepsi, AMD vs Intel in x86, PlayStation vs Xbox in PC-like consoles, iOS vs Android, Google vs Facebook in ad revenue.

We were in quite a duopoly from 2000 to 2007 as well; the days of Matrox vs S3 vs PowerVR vs 3dfx vs ATI vs Nvidia were quite gone (in the gamer PC add-in card space) before the decline in unit sales.
 
Isn't that somewhat a "natural" place to be? Coke vs Pepsi, AMD vs Intel in x86, PlayStation vs Xbox in PC-like consoles, iOS vs Android, Google vs Facebook in ad revenue.

We were in quite a duopoly from 2000 to 2007 as well; the days of Matrox vs S3 vs PowerVR vs 3dfx vs ATI vs Nvidia were quite gone (in the gamer PC add-in card space) before the decline in unit sales.

And the truth of the matter is, people don't actually want choice. Don't confuse internet fanbois arguing for the population writ large, or even for the consumer GPU-purchasing population writ large.

People just want to know what to buy without thinking about it - hence Apple's dominance. They have other things to worry about more important to them.
 
Isn't that somewhat a "natural" place to be? Coke vs Pepsi, AMD vs Intel in x86, PlayStation vs Xbox in PC-like consoles, iOS vs Android, Google vs Facebook in ad revenue.

We were in quite a duopoly from 2000 to 2007 as well; the days of Matrox vs S3 vs PowerVR vs 3dfx vs ATI vs Nvidia were quite gone (in the gamer PC add-in card space) before the decline in unit sales.

It's true, but these other companies were all comparatively tiny. Intel is the first serious competition in two decades!

But it's going to be an uphill battle for anyone to justify joining this falling market
 
Isn't that somewhat a "natural" place to be? Coke vs Pepsi, AMD vs Intel in x86, PlayStation vs Xbox in PC-like consoles, iOS vs Android, Google vs Facebook in ad revenue.

We were in quite a duopoly from 2000 to 2007 as well; the days of Matrox vs S3 vs PowerVR vs 3dfx vs ATI vs Nvidia were quite gone (in the gamer PC add-in card space) before the decline in unit sales.
I would never use the word "natural". I would argue it's a typical place to be in our society. I can attest, with the streaming wars, that it's annoying to want to watch something only to find out it's on one of the other 30 streaming services I don't subscribe to. However, having only two choices continuously screws over the consumer. We should be encouraging some choice.
 
Idk, I'm convinced that most people don't actually want choice and just want "competition" to make it more convenient to keep buying the same brand.
I was really hoping for a new "RX580" type card.... Upper mid range, but hella cheap. That would have had me running AMD systems with Intel graphics. Unfortunately the universe clearly couldn't handle such a thing, and balance must be maintained.
 
Idk, I'm convinced that most people don't actually want choice and just want "competition" to make it more convenient to keep buying the same brand.
As someone who used Cyrix processors and misses choice, this makes me sad if true.
 
Idk, I'm convinced that most people don't actually want choice and just want "competition" to make it more convenient to keep buying the same brand.
I gave it a try. Went back to a GTX 1070. I really wanted it to work, but no drivers worked on my 2700X system even though ReBAR was enabled.
Absolutely no video output. Recognized by software, but it did not even spin up the fan, and though software recognized it as a slave, nothing worked.
RMAed it.
 
Oh agreed - but I have faith in good leaders, and Gelsinger is one of the best I've ever had the pleasure of working for/with.

Yeah, well - this is going to take a Jobsian-type turnaround. You can fanboi about your friend all you want - I'm sure he's a great kisser. From my perspective: I've already worked for a company in this condition and understand intimately what needs to happen.

Intel is finished as an industry leader unless they can catch up on process. AMD is pumping out 5nm designs while knee-deep in Zen 5 development.

There's some sleight of hand going on here:

The first "joke" is e-cores. Now in a laptop this is completely understandable (Apple's primary market). But not in high performance computing. The only reason Intel went that route is to stretch 10nm as far as possible. And they are hitting a wall.

Unless they outsource their manufacturing right now- they're done. They don't even have a 7nm part. And Zen 4 is cruising at 5nm. A 2023 7nm release coming from an Intel fab isn't going to save them. AMD has legs with Zen 4 and 5. There's lots of headroom there.

How much you want to bet that e-cores go away in x86 land assuming Intel catches up in process? It's a lock.

They are, in short, pucked.
 