
Qualcomm’s boom highlights AI’s shift to the edge

A year ago, Qualcomm was out of favor with investors. In fact, as recently as October of last year, the company’s shares were teetering at a 52-week low. The long-time creator of mobile technologies and owner of valuable intellectual property was mired in a crisis caused by slow growth in China and weak personal computer and smartphone markets.

Fast-forward to now. Enter artificial intelligence. In just six months, Qualcomm’s shares went from a 52-week low to an all-time high as the market realized the company has the components and technology to compete in the AI market, with devices such as smartphones and PCs becoming key vehicles for delivering AI inference – the stage at which the results of AI models are delivered to customers on devices. This is what many people in the technology industry call the “edge” – devices connected to infrastructure.

The launch of Qualcomm’s new Snapdragon X series of AI inference chips has coincided with favorable shifts in the PC and device markets, giving Qualcomm that boost. The company has also made a number of announcements with key partners, such as Microsoft, that are implementing its AI computing technology in consumer devices.

Enthusiasm for AI Edge

Qualcomm’s example shows how the business media and Wall Street have begun to embrace the idea that the requirements for AI are broader than just delivering large language models (LLMs) and chatbots from the cloud. There is also edge AI, private enterprise AI, and vertical AI.

Demand for AI computing extends to billions of devices around the world, from cars to cameras – the devices often referred to as the Internet of Things (IoT). Anything connected to infrastructure or networks will require more computing power and connectivity to run AI models.

What does this mean for the entire AI infrastructure? Our recent research and discussions with technology developers indicate that the conversation around AI infrastructure is about to change. I think in the next few years we will talk less about LLMs and chatbots and more about vertically oriented AI applications and infrastructure – and private AI for enterprises.

Chatbots are an attractive mass market, but they address only one segment – consumer information. The closest analogue is the search market, where Google holds an 80% to 90% share and generates quarterly revenue of approximately $80 billion. The total search market is currently estimated at around $400 billion. Enterprise infrastructure and industrial technology markets are hundreds of billions of dollars larger.

The artificial intelligence market will go far beyond consumer information and chatbots. It also has a variety of applications in data analytics, robotics, healthcare, and finance – just to name a few. Many of these more specific vertical markets may not need an LLM at all, but rather more specialized AI technologies, which may include small language models (SLMs) or other purpose-built AI processing software. They will need to deliver results – AI inference – to countless hardware platforms, from cars to medical devices.

“We have only scratched the surface of AI as it moves into vertical, private AI, edge and distributed cloud. AI is more than LLMs and SLMs, and industry- and domain-specific models will dominate new deployments beyond the big cloud players,” Mike Dvorkin, co-founder and CTO of cloud networking company Hedgehog, told me in a recent interview. “The opportunities are enormous and will require new thinking about infrastructure and how it is used.”

AI powers private AI and hybrid infrastructure

If Dvorkin, a former distinguished Cisco engineer, is right, the edge AI infrastructure market will be gigantic.

This theme has come up in many discussions I’ve witnessed recently, with some technologists estimating that the AI market could shift from 80% model training and 20% inference to the reverse. Additionally, CIOs I listened to recently pointed out that private AI models will be much more useful in specific industries, such as healthcare and finance, where enterprise customers may want to own as much of their data and models as possible.

For this reason, the AI wave will drive more diverse hybrid and multi-cloud architectures – including private clouds – as demand for data, analytics and connectivity spreads across multiple infrastructures.

“We have a hybrid cloud model,” George Maddalino, Mastercard’s chief technology officer, said at a recent technology event hosted by The Economist in New York. “We have premium workloads and hyperscale workloads. You can see us moving from a bank data center to a hyperscale cloud to a cloud retailer. By default we go to a multi-cloud environment.”

Nizar Trigui, CTO at GXO Logistics, echoed the idea that AI application data connectivity will be commonplace and available in any location.

“Most of us are going through some kind of digital transformation,” Trigui said. “How do we create more value for customers? We create value from data in 1,000 warehouses around the world, digitally connected.”

The most important takeaway from Qualcomm’s recent growth is the worldwide enthusiasm for AI – and the push to process data and run inference wherever that data lives. This endeavor will not be limited to infrastructure or models owned solely by hyperscalers, but will spread broadly across enterprises, edge devices, and the IoT.

(The author has no position in Qualcomm stock.)