Rumors of ARC's cancellation...

Not sure I'd agree that Itanium, if you mean ia-64 CPU, was around for 20 years, but as a "project", perhaps.
 
Raja better get his crap together or he will be booted from another GPU company.

Raja is useless. The fact that he's still employed after this debacle is laughable.

it's not like Intel to quit so quickly...

Maybe not Intel historically, but according to some articles I've read, their new CEO has a tendency to cut bait and run if he doesn't think a project is going to be profitable.
 
Raja is useless. The fact that he's still employed after this debacle is laughable.
Two or more years ago I called him out for being useless, and someone here pointed out that he reads this forum and I might make him commit suicide. I felt bad.

He does seem to be full of hot air though.
 
Let's see: Raja is supposed to design a GPU virtually from scratch, work around all the Nvidia and ATI/AMD patents, get the drivers into good shape, and compete with Nvidia and AMD on the first go-around? I would think the 3rd round is the cut point where Intel has to be competitive in this area if they're going to stay in it, with the 2nd attempt much better than the 1st. ARC has probably already surpassed Nvidia Pascal and RDNA 1, so how far behind is Intel really? Now if AMD and Nvidia make as big a jump as people are jabbering about in this coming generation, Intel will have to work much harder. It still comes down to price/performance and getting the drivers into a good enough state. Intel may already have the best video encoder, which for many can be very useful. I really wonder how far along Battlemage is. Can it be pushed up?

Intel could use a killer game/software title for ARC as well. A great price, a game bundle that works well with it, and maybe even some video-making software could drive it forward. Talking the $200 price range. The price has to be very aggressive, as in aggressively low, depending on how the drivers and performance are.
 
Yeah, they kept Optane around for a few years before finally killing that turkey.

They also kept the Itanium processor around for almost 20 years before killing that one.
Optane was incredibly powerful when it was used as designed - in the datacenter. There was never a reason for a consumer version, really - we have neither the memory controllers for tiered RAM (PMEM), nor the use/need for that level of performance (NVMe storage version). Nor the slots to make the memory version make sense. But none of that is true in the DC space - memory tiering was useful for some workloads, App-Direct mode was VERY useful for large scale DB work (HANA especially), and the storage tier for high-end enterprise arrays was often in use.
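Since App-Direct mode came up - for anyone curious what "App-Direct" actually means for software, here's a minimal sketch using PMDK's libpmem (the path /mnt/pmem/demo is hypothetical; it assumes an fsdax namespace mounted with DAX). Instead of going through the page cache and block layer, the application maps the persistent memory directly and flushes CPU caches to make writes durable:

```c
/* Minimal App-Direct sketch using PMDK's libpmem.
 * Assumes a DAX-mounted pmem filesystem at /mnt/pmem (hypothetical path).
 * Build: cc appdirect.c -o appdirect -lpmem */
#include <libpmem.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    size_t mapped_len;
    int is_pmem;

    /* Map a file on the pmem device straight into our address space:
     * loads and stores hit the media, no page cache in between. */
    char *buf = pmem_map_file("/mnt/pmem/demo", 4096, PMEM_FILE_CREATE,
                              0666, &mapped_len, &is_pmem);
    if (buf == NULL) {
        perror("pmem_map_file");
        return 1;
    }

    strcpy(buf, "hello, persistent world");

    /* Flush CPU caches so the store survives power loss. */
    if (is_pmem)
        pmem_persist(buf, mapped_len);
    else
        pmem_msync(buf, mapped_len);  /* fallback when not real pmem */

    pmem_unmap(buf, mapped_len);
    return 0;
}
```

That direct load/store path (plus the transactional libraries layered on top, like libpmemobj) is what made it so attractive for large-scale DB work like HANA.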
Not sure I'd agree that Itanium, if you mean ia-64 CPU, was around for 20 years, but as a "project", perhaps.
20 years on the dot. Released June 2001, final shipments July 2021.
Raja is useless. The fact that he's still employed after this debacle is laughable.
Maybe not Intel historically, but according to some articles I've read, their new CEO has a tendency to cut bait and run if he doesn't think a project is going to be profitable.
I've worked with Gelsinger in the past. He's absurdly competent, highly competitive, VERY driven, and believes strongly in core competencies and focused efforts. Things that don't drive your core business, or what could become a core business, are not something to waste time on - even if profitable (sell it off if it is). I can easily see GPUs becoming a core business for them - but I can also easily see it being a distraction from fabrication, CPUs, chipsets, and AI/ML work (all DC focused). It could go either way.
Correct, but 20 years? I mean, I suppose it could be true.
See above.
 
I would be surprised if that were the case, only because of how essential GPUs are becoming in the datacentre space and how essential that space is to Intel's business. They really can't afford not to play in that space.
 
Optane was incredibly powerful when it was used as designed - in the datacenter. There was never a reason for a consumer version, really - we have neither the memory controllers for tiered RAM (PMEM), nor the use/need for that level of performance (NVMe storage version). Nor the slots to make the memory version make sense. But none of that is true in the DC space - memory tiering was useful for some workloads, App-Direct mode was VERY useful for large scale DB work (HANA especially), and the storage tier for high-end enterprise arrays was often in use.

20 years on the dot. Released June 2001, final shipments July 2021.

I've worked with Gelsinger in the past. He's absurdly competent, highly competitive, VERY driven, and believes strongly in core competencies and focused efforts. Things that don't drive your core business, or what could become a core business, are not something to waste time on - even if profitable (sell it off if it is). I can easily see GPUs becoming a core business for them - but I can also easily see it being a distraction from fabrication, CPUs, chipsets, and AI/ML work (all DC focused). It could go either way.

See above.

While I appreciate your perspective:

The fact that ARC was fabbed outside of Intel is also worrying. There are downsides to being fabless. AMD has weathered those... but can Intel? And can they bring the fab side of the business up to snuff?

These issues run deeper than how good a CEO is at being CEO - or what fancy book on the 486 was written. This takes a visionary with a deep understanding of the industry and the technology of advanced manufacturing.

What's the one thing he doesn't have experience with? It's building, managing, and running a fab business. He's the wrong guy in the wrong seat. He's not the guy to revitalize the fab operations. Intel needs to advance someone in their manufacturing corps or steal someone from outside who specifically knows how to do this. He doesn't.

Intel needs a manufacturing engineer at the helm. But they've put so much distance between the executive suite and manufacturing that they're blind to this.

In this situation- he's "Gil Amelio" all over again.
 
While I appreciate your perspective:

The fact that ARC was fabbed outside of Intel is also worrying. There are downsides to being fabless. AMD has weathered those... but can Intel? And can they bring the fab side of the business up to snuff?
They have to - and I'd strongly suspect that Gelsinger believes so as well. Fabs were always one of Intel's strong points - just because they're behind now doesn't mean they won't be on top again in the future. Especially as Intel has money to invest to improve there - or to buy licenses for whatever tech they might need. ARC being fabbed outside Intel is definitely a slight on it from many perspectives, and I'd see that being something that rankles Gelsinger as well - it's not a core competency then.
These issues run deeper than how good a CEO is at being CEO - or what fancy book on the 486 was written. This takes a visionary with a deep understanding of the industry and the technology of advanced manufacturing.

What's the one thing he doesn't have experience with? It's building, managing, and running a fab business.
Eh... he was an engineer there for years before moving into management. He understands the business - from a hands-in-the-dirt level - and there's no question about Gelsinger being a visionary, at least from my perspective. As for taking it over - that's what you have a good COO or EVP for. And he's always done well picking those.
He's the wrong guy in the wrong seat. He's not the guy to revitalize the fab operations. Intel needs to advance someone in their manufacturing corps or steal someone from outside who specifically knows how to do this. He doesn't.

Intel needs a manufacturing engineer at the helm. But they've put so much distance between the executive suite and manufacturing that they're blind to this.

In this situation- he's "Gil Amelio" all over again.
We'll have to agree to disagree there - Gelsinger had no enterprise software experience (really) when he took over VMware, but he understood the enterprise business and the idea of focusing on what your strengths are, and he brought them to whole new levels. Replacing Maritz was one of the best things that ever happened there - and the executive management he selected to run the business groups made them billions. He might not know fabs, but he knows engineering, he knows CPUs, he knows why they NEED a fab, and he knows the people to bring in to make the necessary changes. The CEO isn't supposed to be making decisions about how to build a fab - just whether they need to be in that business (or the GPU business) at all.
 
Didn't realize HP helped keep that dead horse afloat for so long. Wow. Of course, Java is still alive and so is Cobol.
Less HP and more HP's customers with support guarantees.

If you rewrote your billion-dollar industrial facility's back end so you could convert the whole thing from a sinking ship of DEC VAXes to the Itanic's new hotness, you better believe you're squeezing that support contract for all it's worth...
 
Less HP and more HP's customers with support guarantees.

If you rewrote your billion-dollar industrial facility's back end so you could convert the whole thing from a sinking ship of DEC VAXes to the Itanic's new hotness, you better believe you're squeezing that support contract for all it's worth...
I wonder how thankful and appreciative those Itanic customers were - and are now? Keeping support going for over two decades on dead-end tech is like a double-edged sword: you get cut either way.

AI and quantum computers (IBM seems to be ahead there, and from what I read is also working with Intel on 2nm tech) are forward-looking and, if successful, could make the last era of computers look ancient. The last four decades have been exciting and fast-paced, which many assume is normal (it is not) because they got used to it, but the next may be even faster and more unusual and unexpected in direction. The internet, smartphones, and computers have utterly changed how most people, businesses, etc. interact, which is no news to anyone here, yet I believe most do not appreciate or understand the utterly dramatic impact on society - a large-asteroid-level impact, let's say - while living through it or being born into it. In other words, they take it for granted.

What this means is: will Intel keep up with the next shifts in tech? Manufacturing at quantum-effect sizes, and succeeding at it, would give them more of a technological advantage than they ever had in the past. If they don't, I suspect they will wither away like so many companies/ventures over the ages.

https://semiengineering.com/quantum-effects-at-7-5nm/

Can Intel introduce radically new GPU designs in the future, more quantum-like, with useful AI leading to fast, reliable - cough cough, reliable - AI tech? And who is going to report on and meaningfully test new technologies as they explode over the next couple of decades? All the old and current generation of YouTube creators may just turn into grumpy grandpas and grandmas. What is most disappointing is the canned way people treat tech today, with very little foresight or thought about whether the real impact is significant or not. For example, ARC from what we know is blah, but does it have anything unique? If so, what?
 
Less HP and more HP's customers with support guarantees.

If you rewrote your billion-dollar industrial facility's back end so you could convert the whole thing from a sinking ship of DEC VAXes to the Itanic's new hotness, you better believe you're squeezing that support contract for all it's worth...
Or you know, HP could have not killed off the Alpha -_-
 
You mean like Microsoft, etc...?
Seems like multiple tiers of difference: $10,000 in Microsoft 10 years ago is now worth $116,000, while the same in Intel is now worth $18,000.

Could be some sarcasm, but how is the third-highest stock valuation in the world not one of the best-run companies?
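Taking those figures at face value, the worked math: (116,000 / 10,000)^(1/10) ≈ 1.28, so roughly a 28% annualized return for Microsoft, versus (18,000 / 10,000)^(1/10) ≈ 1.06, or about 6% a year for Intel.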
 
Seems like multiple tiers of difference: $10,000 in Microsoft 10 years ago is now worth $116,000, while the same in Intel is now worth $18,000.

Could be some sarcasm, but how is the third-highest stock valuation in the world not one of the best-run companies?
My point is that companies cancel projects (products) all the time. I think I can name at least 100 cancelled Microsoft projects (probably 1000 for IBM). It's not all that unusual.
 
WHAT A TRACK RECORD!!!!!!! 👍

Maybe Intel and AMD each should have let him throw more parties... to celebrate... some type of accomplishment? 🤔
 
My point is that companies cancel projects (products) all the time. I think I can name at least 100 cancelled Microsoft projects (probably 1000 for IBM). It's not all that unusual.

With the level of hype and resources poured in already, it's not just a random canceled project. It would be like MS canceling the entire Xbox program.
 
With the level of hype and resources poured in already, it's not just a random canceled project. It would be like MS canceling the entire Xbox program.
It's not going to be a gaming powerhouse out of the gate, but the 3rd generation might be.
Intel has a chance to do well if the drivers get there.
 
With the level of hype and resources poured in already, it's not just a random canceled project. It would be like MS canceling the entire Xbox program.
I bet they haven't spent half as much on hype as Microsoft did on Windows Phone, just saying.
 
I bet they haven't spent half as much on hype as Microsoft did on Windows Phone, just saying.
Possible, but making 4 million units would not have been cheap; Microsoft took a giant write-off of $7.6 billion on the Nokia adventure, I think.

Make or break is probably more on the datacenter side of the GPUs (Ponte Vecchio). Like Phone was for Microsoft, the consumer desktop discrete GPU is not a core business for Intel and does not necessarily need to be (that could change, but it would probably be later on the consumer side than on the datacenter side that SoCs with powerful GPUs, or something else linking a GPU to a non-Intel CPU, break into Intel's CPU market).
 
Possible, but making 4 million units would not have been cheap; Microsoft took a giant write-off of $7.6 billion on the Nokia adventure, I think.

Make or break is probably more on the datacenter side of the GPUs (Ponte Vecchio). Like Phone was for Microsoft, the consumer desktop discrete GPU is not a core business for Intel and does not necessarily need to be (that could change, but it would probably be later on the consumer side than on the datacenter side that SoCs with powerful GPUs, or something else linking a GPU to a non-Intel CPU, break into Intel's CPU market).
There's tons of money to be had in the virtualized GPU market if Intel can work with someone like Dell to offer another decent vGPU solution outside of Nvidia.
 
WHAT A TRACK RECORD!!!!!!! 👍

Maybe Intel and AMD each should have let him throw more parties... to celebrate... some type of accomplishment? 🤔
(from 5 years ago)

kazeohin said:
I was only 30 years old.
I loved AMD so much.
My entire PC was AMD branded.
Every night I prayed to AMD,
thanking them for providing affordable PC components.
"Radeon is Love" I say, "Radeon is Life".
My dad hears me and calls me an AMD Fanboi shill.
I knew he was jealous of my PC's price to performance ratio.
I call him an Nvidiot.
He slaps me and sends me to bed.
I'm crying now and my PC crashed.
I feel a warmth approach me...
It's Raja Koduri!
I'm so happy.
He whispers to me "2.5x performance per watt"
He turns over my PC and pulls out a screwdriver...
I'm ready
I open my PC for Raja
He pulls out a brand new Radeon® RX Vega 64™ video card.
It was barely faster than a 12 month old card that used half the power, but I do it for Raja.
I feel my mainboard overheat as Raj draws too many watts over the PCI-E lanes.
Raj lets out a mighty roar as he updates my drivers
My dad walks in
Raja Koduri looks him straight in the eye and says
"The biggest advancement in GPU architecture since GCN*"
Raj leaves through my window.
Radeon is love. Radeon is life.
 
For a first effort the hardware offers more performance than I expected. If they can get it together, sort out the software mess, and hopefully sort out needing ReBAR enabled everywhere, I think Intel may have an interesting product stack here. I'm old like many here and remember the '90s, when there were many companies making video chips/cards and the intense competition led to better and better prices. The fact that there have only been two players in the GPU market for going on 20+ years is insane to me. I really hope Intel doesn't give up on this project, and maybe I'm weird, but I had an 8MB i740 back in the day and it worked just fine for what it was and what I paid for it!
 
G200 alive and kicking in thousands of servers purchasable today. Matrox is doing ok.
... Actually now that I think of it, I have as many Matrox cards active in this house as I do Nvidia. 4 G200s in servers, vs 1080/3070/3080/3090. Damn :p
 
For a first effort the hardware offers more performance than I expected. If they can get it together, sort out the software mess, and hopefully sort out needing ReBAR enabled everywhere, I think Intel may have an interesting product stack here. I'm old like many here and remember the '90s, when there were many companies making video chips/cards and the intense competition led to better and better prices. The fact that there have only been two players in the GPU market for going on 20+ years is insane to me. I really hope Intel doesn't give up on this project, and maybe I'm weird, but I had an 8MB i740 back in the day and it worked just fine for what it was and what I paid for it!
Well, as time went on, the barriers to competing kept going up, as you can clearly see from the struggle Intel is having making good drivers. There's also all the bundled stuff like NVENC, DLSS/FSR, etc. And of course getting up to speed on the hardware side is also required and not simple, even with the iGPU experience Intel already has.
 
For a first effort the hardware offers more performance than I expected. If they can get it together, sort out the software mess, and hopefully sort out needing ReBAR enabled everywhere, I think Intel may have an interesting product stack here. I'm old like many here and remember the '90s, when there were many companies making video chips/cards and the intense competition led to better and better prices. The fact that there have only been two players in the GPU market for going on 20+ years is insane to me. I really hope Intel doesn't give up on this project, and maybe I'm weird, but I had an 8MB i740 back in the day and it worked just fine for what it was and what I paid for it!

Intel could be the new S3 in the modern graphics market! How fortunate for them.
 