By Asa Fitch
The world's largest semiconductor companies face a growing
competitive threat: their biggest customers making their own chips
tailored to the supercharged areas of cloud computing and
artificial intelligence.
Chip making has long been ruled by big manufacturers and design
houses such as Intel Corp., Advanced Micro Devices Inc. and
graphics-chip maker Nvidia Corp. Now Amazon.com Inc., Microsoft
Corp. and Google are getting into the game in the hunt for improved
performance and lower costs, shifting the balance of power in the
industry and pushing traditional chip makers to respond by building
more specialized chips for major customers.
Amazon this month unveiled a new chip that it says promises to
speed up how artificial-intelligence algorithms learn from
data. The company has already designed other processors for its
cloud-computing arm, called Amazon Web Services, including the
brains of computers known as central processing units.
The pandemic has accelerated the rise of cloud computing as
companies broadly have embraced digital tools that rely on
those remote servers. Amazon, Microsoft, Google and others have
enjoyed strong growth in the cloud during the remote-work
period.
Business customers also are showing an increased appetite for
analyzing the data they gather on their products and customers,
fueling demand for artificial intelligence tools to make sense of
all that information.
Google was an early mover among the tech giants, releasing an AI
processor in 2016, and has updated the hardware several times
since. Software giant Microsoft, the No. 2 in the cloud behind
Amazon, has also invested in chip designs, including a programmable
chip to handle AI and another that enhances security. It's now also
working on a central processor, according to a person familiar with
its plans. Bloomberg News previously reported Microsoft's CPU
effort.
Driving the tech giants' moves are changes in how the
semiconductor world operates and a growing sense that Moore's Law
-- the sector's fundamental assumption about the steady improvement
in chip performance -- is losing relevance. As a result, companies
are searching for new ways to eke out better performance, measured
not only in speed but also in lower power consumption and heat
generation.
"Moore's Law has been around for 55 years, and this is the first
time it's slowed down very materially," said Partha Ranganathan, a
vice president and engineering fellow in Google's cloud unit, which
has been pursuing specialized chips.
The sheer size of the cloud giants presents a challenge for
traditional chip producers. In the past, chip makers tended to
design their high-performance processors for generic
applications, leaving it to customers to adapt and get the most out
of the chips. Now the biggest customers have the financial muscle
to push for more optimized designs.
"Whereas Intel in the 1990s was an order of magnitude larger
than all their customers, now the customer has superior scale over
the supplier," said James Wang, an analyst at New York
money-manager ARK Investment Management. "As a result, they have
more capital and more expertise to take components in-house."
Nvidia, now the largest U.S. chip maker by market cap, has a
value of $330 billion, and Intel is at $207 billion. The cloud
behemoths, Amazon, Microsoft and Google-parent Alphabet Inc., each
top $1 trillion in market valuation.
The bespoke efforts are partly made possible by the rise of
contract chip makers, which make semiconductors designed by other
companies. This arrangement helps tech giants avoid the
multibillion-dollar cost of building their own chip factories.
Taiwan Semiconductor Manufacturing Co., in particular, has jumped
to the forefront of chip production technology.
The changes have benefited chip-design firm Arm Holdings Ltd.,
which sells circuit designs that anyone can use after paying a
licensing fee. Apple is a big Arm customer, as are all of the big
tech companies that make their own chips.
Amazon, Google and Microsoft each are estimated to operate
millions of servers in globe-spanning networks of data centers for
their own use and to rent out to their millions of cloud-computing
customers. Even small improvements in performance and minute
reductions in the cost of powering and cooling chips become worth
the effort when spread across those vast technology empires.
Facebook Inc. also has explored working on its own chips.
David Brown, a vice president at AWS, said making its own
processors was an obvious choice for Amazon given the performance
gains it could achieve by dropping compatibility with older
software and other standard features of Intel chips that big data
center operators don't need.
"We were able to build a [chip] that's optimized for the cloud,
so we were able to remove a lot of stuff that's just not needed,"
he said. Amazon's chip-making efforts took off largely with its
acquisition of an Israeli company called Annapurna Labs about five
years ago.
Custom chips are also gaining favor in consumer products. Apple
this year started using its own processors in Macs after 15 years
of sourcing them from Intel. Google has incorporated its AI chip in
its Pixel smartphones.
So far, the lost business for traditional chip makers has been
modest, said Linley Gwennap, a chip-industry analyst. The market
share of all custom-made, Arm-based central processors is less than
1%, he said. Google's AI chips are by far the highest-volume
tech-company-designed processors, he said, comprising at least 10%
of all AI chips. Intel still supplies the vast majority of CPUs
that go into data centers.
The incumbents also aren't sitting idle in the race for cloud
and AI chip supremacy. Nvidia this year agreed to buy Arm in what
would be the chip industry's biggest acquisition. And Ian Buck, who
oversees Nvidia's data center business, said the company is working
closely with its largest customers to optimize the use of its chips
in their hardware setups.
Intel said around 60% of its server central processors sold to
large data center operators are customized to customers' needs,
often by switching off features of the chip that they don't
need.
And Intel has invested in AI processors and other specialized
hardware of its own, including buying Israel-based Habana Labs last
year for about $2 billion. AWS recently agreed to put Habana's AI
training chips in its data centers, as Amazon develops its rival
chips that it thinks will perform better when they appear next
year.
Remi El-Ouazzane, chief strategy officer for Intel's Data
Platforms Group, said Habana's chips at AWS could challenge Nvidia,
long the dominant player in an AI training market that he said
would be worth more than $25 billion by 2024. "It's a big net new
opportunity for Intel, and it's a very large market," he said.
Write to Asa Fitch at asa.fitch@wsj.com
(END) Dow Jones Newswires
December 20, 2020 05:44 ET (10:44 GMT)
Copyright (c) 2020 Dow Jones & Company, Inc.