DeepSeek has rattled the US-led AI ecosystem with its latest models, denting Nvidia’s market capitalization. While sector leaders grapple with the fallout, smaller AI companies see an opportunity to scale with the Chinese startup.
Several AI-related companies told CNBC that the emergence of DeepSeek is a “big” opportunity for them, rather than a threat.
“Developers want to replace OpenAI’s expensive and closed models with open-source models like DeepSeek R1…” said Andrew Feldman, CEO of Cerebras Systems, an artificial intelligence chip startup.
The company sells cloud-based services running on its own computing clusters, competing with Nvidia’s graphics processing units. Feldman said the release of the R1 model produced one of the biggest spikes in demand for those services that Cerebras has ever seen.
“R1 shows that [AI market] growth will not be dominated by a single company. The hardware and software moat does not exist for open-source models,” Feldman added.
Open source refers to software whose source code is made freely available on the web. Unlike competitors such as OpenAI, DeepSeek’s models are open source.
DeepSeek also claims that its R1 reasoning model rivals the best of American technology despite running at lower cost and being trained without cutting-edge graphics processing units, though industry watchers and competitors have questioned these claims.
“Low prices can help drive global adoption, as they did in the PC and internet markets. The AI market is on a similar secular growth path,” Feldman said.
Inference chips
DeepSeek could increase adoption of new chip technology by accelerating the AI cycle from training to the “inference” phase, chip startups and industry experts said.
Inference refers to the act of using a trained AI model to make predictions or decisions based on new information, rather than the process of building or training the model itself.
“Simply put, AI training is about building a tool or algorithm, while inference is about actually deploying this tool for use in real applications,” said Phelix Lee, an equity analyst at Morningstar.
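To make the distinction concrete, here is a minimal sketch in Python using scikit-learn; the toy data and model are purely illustrative and are not tied to any company mentioned in this article. The expensive, one-time fit step stands in for training, while the cheap, repeated predict calls stand in for inference.

```python
# Minimal sketch of the training-vs-inference distinction.
# The data below is made up purely for illustration.
from sklearn.linear_model import LinearRegression

# --- Training: computationally heavy, done once, builds the "tool" ---
X_train = [[1.0], [2.0], [3.0], [4.0]]   # historical inputs
y_train = [2.0, 4.0, 6.0, 8.0]           # historical outputs
model = LinearRegression().fit(X_train, y_train)

# --- Inference: cheap per call, repeated for every new request ---
# The trained model is applied to unseen data to produce a prediction.
prediction = model.predict([[5.0]])
print(prediction)  # ~[10.0]
```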
Nvidia holds the dominant position in GPUs used for AI training, but many competitors see ample room to expand in the “inference” segment, where they promise greater efficiency at lower cost.
Although AI training is very computationally intensive, inference can work with less powerful chips programmed to perform a narrower range of tasks, Lee added.
Many AI chip startups told CNBC that demand for inference chips and computing is growing as clients adopt and build on DeepSeek’s open-source model.
“[DeepSeek has] demonstrated that smaller open models can be trained to be as capable or more capable than larger proprietary models, and at a fraction of the cost,” said Sid Sheth, CEO of AI chip startup d-Matrix.
“The broad availability of small, capable models has catalyzed the era of inference,” he told CNBC, adding that the company has recently seen a surge in interest from global customers looking to speed up their inference plans.
Robert Wachen, co-founder and COO of AI chipmaker Etched, said dozens of companies have contacted the startup since DeepSeek released its reasoning models.
“Companies are shifting their spending from training clusters to inference clusters,” he said.
“DeepSeek-R1 proved that inference-time compute is now the [state-of-the-art] approach for every major model vendor, and thinking isn’t cheap. Scaling these models to millions of users will require ever more computing power.”
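A rough back-of-envelope sketch helps show why inference-time “thinking” gets expensive at scale; every figure below is an assumption chosen for illustration, not a number from the article. Reasoning models generate long hidden chains of thought on top of each visible answer, multiplying the tokens, and therefore the compute, needed per query.

```python
# Back-of-envelope: why inference-time reasoning is costly at scale.
# All figures below are hypothetical, for illustration only.
users_per_day = 1_000_000          # assumed active users
queries_per_user = 10              # assumed queries per user per day
answer_tokens = 500                # assumed tokens in a typical final answer
reasoning_tokens = 5_000           # assumed hidden "thinking" tokens per query

standard_total = users_per_day * queries_per_user * answer_tokens
reasoning_total = users_per_day * queries_per_user * (answer_tokens + reasoning_tokens)

print(f"standard model:  {standard_total:,} tokens/day")
print(f"reasoning model: {reasoning_total:,} tokens/day "
      f"({reasoning_total / standard_total:.0f}x the compute)")
```

Under these assumed numbers, serving a reasoning model takes roughly an order of magnitude more token throughput than serving a standard chat model, which is the scaling pressure Wachen describes.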
Jevons paradox
Analysts and industry experts agree that DeepSeek’s accomplishments are a boon for AI inference and the broader AI chip industry.
“DeepSeek’s performance appears to be based on a series of engineering innovations that significantly reduce inference costs while also improving training costs,” according to a report from Bain & Company.
“In a bullish scenario, ongoing efficiency improvements would lead to cheaper inference, spurring greater AI adoption.”

This pattern describes the Jevons paradox, the theory that as a new technology becomes cheaper and more efficient to use, overall demand for it increases.
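One way to see the paradox: if demand for AI compute is sufficiently price-elastic, total spending rises even as per-unit prices fall. Here is a minimal sketch with a hypothetical constant-elasticity demand curve; the elasticity value and all numbers are made up for illustration.

```python
# Minimal sketch of the Jevons paradox under an assumed constant
# price elasticity of demand. All numbers are hypothetical.
def demand(price, base_price=1.0, base_demand=100.0, elasticity=1.5):
    """Constant-elasticity demand: demand = base * (price/base_price)^-e."""
    return base_demand * (price / base_price) ** -elasticity

for price in (1.00, 0.50, 0.10):   # cost per unit of inference falls
    q = demand(price)
    print(f"price {price:.2f} -> demand {q:7.1f}, total spend {price * q:6.1f}")

# With elasticity > 1, total spend *rises* as price falls:
# cheaper inference can mean a bigger, not smaller, AI compute market.
```

When the assumed elasticity exceeds 1, each price cut grows consumption faster than it shrinks the price, so total spending on compute goes up, which is the bullish reading analysts apply to DeepSeek’s efficiency gains.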
Financial services and investment firm Wedbush said in a research note last week that it continues to expect AI use across businesses and retail consumers around the world to drive demand.
Speaking on CNBC’s “Fast Money” last week, Sunny Madra, COO of Groq, which develops chips for AI inference, suggested that as overall demand for AI grows, there is room for smaller players to grow as well.
“As the world is going to need more tokens [a unit of data that an AI model processes], and Nvidia can’t supply enough chips to everyone, it gives us opportunities to sell into the market even more aggressively,” Madra said.