Nvidia (NVDA) dominates the market for artificial intelligence chips used in internet data centers and cloud computing. But Nvidia stock faces more pressure in the coming year, as 2020 is sure to bring more competition in AI chips from processor giant Intel (INTC) and others.
Look for big changes in the fast-growing market for the powerful computer chips that process AI software for emerging business applications, analysts say. The AI chip battleground pits Nvidia versus Intel, which gobbled up another AI startup, Habana Labs, for $2 billion in mid-December. There's also a wave of startups such as Graphcore, Mythic, Wave Computing and SambaNova Systems.
Meanwhile, Amazon.com (AMZN) in November strutted out a homegrown AI chip at its re:Invent conference. Amazon Web Services plans to use the chip, dubbed "Inferentia," for its own cloud computing services. Amazon joins Alphabet's (GOOGL) Google, Microsoft (MSFT) and Facebook (FB) in designing special-purpose AI chips.
A shift in the type of artificial intelligence software used in data centers and cloud computing services will open the door to more competition, analysts say.
"Going forward, 2020 is likely to see many more companies starting to ship and volumes ramp as competition for Nvidia heats up," said Tractica analyst Aditya Kaul.
Nvidia Leadership Challenged As AI Chip Market Booms
Intel and some startups are revving up production of AI chips that could undermine Nvidia's leadership. For that reason, Nvidia stock is one artificial intelligence stock to watch.
With roots in computer gaming, Nvidia continues to improve high-end graphics processors that gave it an opportunity to target the AI market. However, an ongoing shift to a new type of AI software could create demand for special-purpose accelerators, analysts say.
There's a big market in AI chips for cloud and corporate data centers. Tractica sees the AI chip market quadrupling to $6.7 billion in 2022, from $1.66 billion in 2018.
AI software uses algorithms that aim to mimic the human ability to learn, interpret patterns and make predictions.
Industries most often deploy "machine learning" AI. Machine learning systems use huge troves of data to train algorithms to recognize patterns and make predictions.
Nvidia dominates in selling AI chips for training purposes. But Advanced Micro Devices (AMD) seems to be gaining traction in that market, analysts say. Nvidia's cloud computing customers process vast amounts of data to train AI algorithms and create computing models.
Inferencing Creates Demand For New AI Chips
Many emerging business applications of AI software involve the use of what is known as "inferencing." With inferencing, AI algorithms handle less data but generate responses more rapidly.
Data centers mainly use AI chips for training. But industry watchers expect to see inferencing AI chips more widely deployed in automotive, robotic and industrial Internet of Things applications.
"Training and inference are intimately related to one another," said David Kanter, one of the chairs of MLPerf, an industry consortium of tech firms that measures performance of machine-learning workloads. "Training is the process of sifting through data to collect insights and build a model. Inference is using that model operationally. Inference could be translating text, recommending web pages or products, finding pedestrians on the street, and many other things."
Kanter added: "Broadly speaking, the market for inference solutions is much bigger and spans from the tiniest devices like smartphones or voice assistants to massive data centers and autonomous vehicles. Roughly speaking there are an order of magnitude more companies developing inference processors than training processors — easily over a hundred. It's also very much a greenfield market. The dominant solution in data centers is a standard server processor, but in the next year or two we will see a huge number of accelerators coming to market from many different companies."
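Kanter's distinction can be sketched in a few lines of Python. This toy example stands in a least-squares fit for the neural networks used in real AI workloads (a simplification, not how production systems are built), but the split is the same: training is the compute-heavy pass over a large dataset, while inference is the cheap per-request use of the finished model.

```python
# Toy contrast between "training" (fitting a model on data) and
# "inference" (applying the fitted model to new inputs).
import numpy as np

rng = np.random.default_rng(0)

# --- Training: sift through data to build a model (here, fit weights w) ---
X = rng.normal(size=(1000, 3))             # 1,000 training examples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.01, size=1000)
w, *_ = np.linalg.lstsq(X, y, rcond=None)  # the compute-heavy step

# --- Inference: use the trained model operationally on one new input ---
x_new = np.array([1.0, 1.0, 1.0])
prediction = float(x_new @ w)              # cheap per-request computation
print(prediction)
```

Training here touches all 1,000 examples at once; inference touches a single input vector, which is why inference hardware can trade raw throughput for low power and low latency.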
With cloud computing expected to evolve into "edge computing," inferencing AI chips will play a bigger role, analysts say.
Edge computing deploys data processing, storage and networking close to sensors and other sources where data originate. The goal is to process and analyze data locally in real time rather than send it to faraway data centers in the internet cloud.
Along with Intel and others, more than 70 companies are pushing into the AI chip market, says Tractica.
Nvidia Stock: Data Center Boost From Inferencing?
"Ultimately we think Nvidia will find it harder to dominate inference, especially outside the data center," UBS analyst Timothy Arcuri said in a recent report to clients.
When reporting third-quarter earnings in November, Nvidia said inferencing applications are creating demand for its AI chips. Nvidia pointed to "conversational speech AI," a term for speech recognition and natural language processing, as the early driver for inferencing AI chips.
Customer service chat bots are one application.
"(Nvidia's) Q3 datacenter growth was driven by a combination of AI training and AI inferencing," Raymond James analyst Chris Caso said in a recent note to clients. "While datacenter revenue was down year over year due to lower system sales, both training and inferencing posted record sales. Inferencing growth is particularly important since bulls had been waiting for inferencing to gain traction for two years."
Nvidia stock holds a strong position as inferencing apps create more demand for artificial intelligence chips. Expect plenty of competition for Nvidia, though, says 650 Group analyst Alan Weckel. As it stands, companies mostly use Nvidia's graphics processing units, or GPUs, for AI training apps.
But, "There are a ton of AI ASIC companies out there" targeting data center servers and other markets, said Weckel.
New Types Of AI Chips In Development
ASICs, or application-specific integrated circuits, work as custom chips for a particular task. One plus for ASICs: They generally require less power than general-purpose AI chips.
Amazon's Inferentia is one example of new AI ASICs coming to market. Intel, meanwhile, now works with Facebook in developing AI inferencing chips.
Another type of artificial intelligence chip expected to gain market share is the field-programmable gate array, or FPGA. Xilinx (XLNX) and Intel have special-purpose AI accelerators in development, analysts say. Startups Flex Logix and Achronix Semiconductor also have programmable AI chips in development.
"2020 will be an exciting year for AI chipsets," ABI Research analyst Lian Jye Su said in a report. "Several stealth startups are likely to launch programmable chipsets for data centers, while the emergence of new AI applications in edge devices will give rise to more ASICs dedicated for edge AI inference workloads."
Despite the rise in competition, Tractica's Kaul expects Nvidia to thrive for a while.
"Nvidia is still in a good position and is likely to dominate for some time because of their robust and flexible software platform and stickiness with AI developers," he said.
Follow Reinhardt Krause on Twitter @reinhardtk_tech for updates on 5G wireless, artificial intelligence, cybersecurity and cloud computing.
The post Battle In AI Chips Heats Up As Nvidia Faces A Brigade Of Rivals appeared first on Investor's Business Daily.