Intel Is Investing $20 Billion Towards a Massive New Semiconductor Plant

Intel has to make these investments; they've been falling behind to the point that Apple gave them the boot and designed their own, better chips.

Process node isn't specifically the reason Apple went with its own silicon. That's often just a leapfrogging game. They have a broad license from ARM and are free to make modifications for their own purposes. The reason they ditched PowerPC (IBM and Freescale) for Intel was that Intel had better power management features and lower power use. But Apple is really big on power management now, and they figured they could do better with their own silicon.
 
Usually, process nodes and power efficiency correlate. In the old days Intel was pretty much the only one who could afford the latest nodes, but then they got cocky, and the last couple of incompetent CEOs decided to milk the monopoly rather than invest further, so...
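For what it's worth, that correlation has a physical basis: dynamic CMOS power goes roughly as P = alpha * C * V^2 * f, and a newer node shrinks both the switched capacitance and the supply voltage. A quick sketch with made-up numbers (illustrative assumptions, not real process data):

```python
# Dynamic (switching) power of CMOS logic: P ~ alpha * C * V^2 * f.
# All values below are illustrative assumptions, not real process data.

def dynamic_power_w(alpha: float, c_farads: float, v_volts: float, f_hz: float) -> float:
    """Activity factor x switched capacitance x supply voltage squared x clock."""
    return alpha * c_farads * v_volts ** 2 * f_hz

# Hypothetical "old" node: 1 nF effective switched capacitance, 1.2 V, 3 GHz.
old = dynamic_power_w(alpha=0.2, c_farads=1.0e-9, v_volts=1.2, f_hz=3.0e9)

# Hypothetical newer node: ~30% less capacitance and a 0.9 V supply, same clock.
new = dynamic_power_w(alpha=0.2, c_farads=0.7e-9, v_volts=0.9, f_hz=3.0e9)

print(f"old node: {old:.2f} W, newer node: {new:.2f} W")
# The V^2 term dominates: dropping 1.2 V to 0.9 V alone cuts power ~44%,
# which is why the newest node usually wins on performance per watt.
```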

Andy Grove would never have let that happen.
 
Certainly, when Intel was well ahead of everyone else, they had that advantage, but Apple's decision to go with its own silicon seemed to be a lot more about architectural control over features and performance. I even remember back when Apple's processor silicon only went into iPhones/iPads and Apple was using both Samsung and TSMC as foundries.

Heck, Intel's move into the foundry business was supposed to be revolutionary because, at last, the most advanced processes would be available to outside customers. Some of their customers were supposedly cutting-edge companies like Tabula. That didn't work out too well.
 
I heard Nokia got burnt badly by Intel's screw-ups when using their foundry. Apple isn't chasing the general, one-size-fits-all market the way Intel is (servers using the same instruction set as laptops, or even trying to force OEMs to do phones and tablets), so they would never need a lot of the power-hungry designs Intel tries to shove into everything.

I wonder if one day we will see AMD doing an ARM/x86 hybrid to get into the mobile market, because Qualcomm will certainly try to move from mobile into laptops eventually.
 
I don't follow this as much as others do, but on the claim that process node wasn't the reason Apple left, I have no problem being "checked" if I'm wrong.
I'm almost certain that Intel could not make the chip that Apple wanted; they didn't have the capability to squeeze/shrink more transistors onto their chips. Whether this was the key reason I don't know, but I do know it's been speculated about for a long time now. I came at it purely from the "investment" point of view, since that's where I learned it: a Wall Street perspective at the time.
I briefly looked this up again, but I'd read about it some time ago, not necessarily in this publication; I think it was an investment piece. Intel was always behind, like three years behind even with their 10nm chips, never mind the foreign manufacturers' new 7nm chips.

"Stepping back and picking this announcement apart a bit, we had a good idea that Apple would be stepping away from Intel because, frankly, the company has continuously dropped the ball in the last handful of years. Chipmakers like Qualcomm and Samsung are already delivering products based on a second generation of 7nm lithography while Intel struggled mightily just to get 10nm chips into products late last year. What this means is that, in a relatively short window, Macs with Intel chips have gone from the best performance-per-watt to the worst."

Source = https://www.inputmag.com/tech/apple-arm-chip-cpu-hardware-details

The reason I knew of the issues is that I was looking at it from a beaten-up-company investment outlook: buying a company beaten into the ground by missteps. Glad I never touched this one.
 
It would have been interesting if Nvidia's purchase of ARM had succeeded.
 
Running clean rooms is very energy-hungry. Where will these new high-dollar chip plants get their power, with old electric power plants becoming too costly to meet stricter pollution requirements, electric vehicles becoming more popular, and nuclear plants impossible to build in the United States?
 
The performance-per-watt line in that article quote is exactly the point @y_p_w was making; that's what he was referencing with power management.

Also, this gives Apple the ability to use the same instruction set and basically the same silicon across all their i-devices, which makes sense as they are trying to harmonize/unify their operating system platforms as well. I'm sure you've noticed many iPhone/iPad features showing up in macOS over the last few releases.
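To make "performance per watt" concrete: it's just a throughput score divided by package power, so a chip can lose on raw score and still win where battery life matters. A toy comparison (the scores and wattages are invented placeholders, not benchmarks of any real chip):

```python
# Toy performance-per-watt comparison. Scores and wattages are invented
# placeholders for illustration, not measurements of any real chip.
chips = {
    "hypothetical x86 laptop chip": {"score": 1500, "watts": 28.0},
    "hypothetical ARM SoC":         {"score": 1400, "watts": 7.0},
}

for name, spec in chips.items():
    ppw = spec["score"] / spec["watts"]  # higher is better
    print(f"{name}: {ppw:.0f} points per watt")

# The ARM part "loses" on raw score here but delivers roughly 4x the
# points per watt, the metric that decides fanless designs and battery life.
```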
 
This all started with Apple's purchase of PA Semi back in 2008. Nobody really knew what Apple was doing then, as they had just switched to Intel's processors for their computers in 2006 and were using Samsung's ARM-based processors for the first iPhones. PA Semi specialized in customized POWER-architecture processors meant for power efficiency, and there were some concerns about whether or not Apple was going to discontinue support for many of PA Semi's military customers (Apple pledged that while they would no longer make new products in that line, they would continue legacy support). For some reason, Apple started the naming of its first in-house processors at A4, but obviously they bought PA Semi to bring in employees who knew how to customize processor designs.
 
Apple always has a lot of work going on way before other companies on "future" stuff. They often have products that were finished but not quite good enough (e.g., imagine an Apple Watch that wouldn't last at least a day without charging) and decide to scrap them instead of releasing them to the market.

ARM was an obvious fit for the PC for a long time; Apple just decided it wasn't ready for prime time and kept designing the next generation until it finally was. They don't want to be the last one on an old architecture like in the G4 days. It will be interesting to see if/when the data center moves to ARM next, as x86, even with AMD, is still too power-hungry for a lot of workloads. Not everything needs to run on a Xeon; sometimes a bunch of ARM cores are the way to go.
 
If Nvidia's purchase of ARM had succeeded, they would have gone all-in on data center chips. They have GPUs, they have Mellanox, and with ARM they could have put it all together into a very good package and sold it as the only company that can do GPU/ARM straight to the fabric.
 
In a nutshell, we all know that lithography based on DUV with lenses + water immersion (water has a high index of refraction) + multi-patterning (stacking multiple masks together) would not scale well past 14nm. Everyone (Nikon and Canon) was giving up except ASML. So TSMC invested early, and when ASML finally figured out how to get EUV working (with molybdenum/silicon mirrors in a vacuum, since air and lenses would absorb EUV and reduce the energy of the laser), TSMC was ready. Intel, either placing a cheap bet by not paying $20B each to build that kind of fab, or figuring that as a monopoly they could do what they wanted even while behind the curve, decided to keep tweaking the older tech instead.
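To put rough numbers on that wall, the usual back-of-the-envelope is the Rayleigh criterion, CD = k1 * wavelength / NA. The k1 and NA figures below are common textbook values used as assumptions, not actual process specs:

```python
# Rayleigh criterion: smallest half-pitch a single exposure can print.
#   CD = k1 * wavelength / NA
# k1 and NA below are typical textbook figures, assumed for illustration.

def min_half_pitch_nm(k1: float, wavelength_nm: float, na: float) -> float:
    return k1 * wavelength_nm / na

# DUV: 193 nm ArF light; water immersion (refractive index ~1.44) raises
# the effective numerical aperture to about 1.35.
duv = min_half_pitch_nm(k1=0.30, wavelength_nm=193.0, na=1.35)  # ~43 nm

# EUV: 13.5 nm light with reflective molybdenum/silicon mirrors, NA ~0.33.
euv = min_half_pitch_nm(k1=0.30, wavelength_nm=13.5, na=0.33)   # ~12 nm

print(f"DUV immersion, single exposure: ~{duv:.0f} nm half-pitch")
print(f"EUV, single exposure:           ~{euv:.0f} nm half-pitch")
# Anything tighter than the DUV figure forces multi-patterning (several
# masks per layer), which is exactly where cost and complexity blew up.
```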

It would have been OK if 1) Apple hadn't left and had nowhere else to go, and 2) AMD had decided not to sell its fabs and go to TSMC. All of a sudden, Intel is caught with its pants down, with nothing to compete with for (by most estimates) 5 years or (by their own admission) 2 years.

Part of Intel's miscalculation was that they didn't believe anyone would be able to afford building chips with EUV. They were sort of correct, except that Apple doesn't make money on its chips; it makes money off phones and services. Same for Amazon, Google, and Facebook: they can afford to lose a ton of money on chips and make it back in cloud services (50% profit margins). As for Nvidia, they make a ton of money because AI/ML/crypto mining is hot right now; AMD too, because of GPU demand, which helped their CPU business as well.

Intel still makes a lot of money, but how long will it last? Will they go down like Motorola or IBM, or will they revive with the new CEO making the right choices going forward? I'm placing some bets and will see in 5 years; wish me luck. Hopefully they can build a competent foundry business and a good GPU, and get the CPU business back in order.
 
Reminds me of when Intel was trying to buy Nvidia. Remember that?
 
I don't remember that, but I LOLed when they lowballed Mellanox and lost the deal. Intel should have bought Xilinx instead of Altera; the CEO probably listened to the accountants instead of the engineers and customers.
 
That was the rumour at one point. Later, from what I recall, the price that Nvidia came up with made Intel balk, and nothing ever became of it.
 