INTERVIEW | ‘Practical use cases to boost enterprise AI PC adoption’

AI will redefine PC capability by enhancing performance and enabling complex tasks like natural language processing and data analysis

Vinay Sinha, corporate vice-president, India Sales, AMD.

With artificial intelligence (AI) PCs entering the consumer market this year, the big question is: When will they reach the enterprise market? Vinay Sinha, corporate vice-president, India Sales, AMD, cites numerous benefits like improved computing performance, reduced latency, and cost savings from running AI workloads locally. “Shipments of AI-capable and AI-powered PCs have just commenced in India. With the number of practical use cases for AI expected to grow dramatically over the next 6-18 months, companies will begin adopting AI PCs soon,” he told Sudhir Chowdhary in an interview. Excerpts:

How will AI affect the overall PC market?

With rising demand, AI is poised to redefine the capabilities of the PC by enhancing performance and enabling complex tasks like natural language processing, image recognition, and data analysis. This evolution will drive innovation, making dedicated AI engines and accelerators standard features in modern PCs and transforming how users interact with them.
A dedicated AI engine, like Ryzen AI powered by AMD’s XDNA architecture, accelerates AI workloads without straining system resources. It uses minimal power, leaving the CPU (central processing unit) and GPU (graphics processing unit) free to handle other tasks and minimising the overall performance impact.

When will AI PCs hit the enterprise market?

SMBs and enterprises are actively looking to develop or deploy AI services in their workflows and run those services locally, especially as most independent software vendors are either integrating AI into their products or planning to do so. With the number of practical use cases for AI expected to grow dramatically over the next 6-18 months, companies will have to begin adopting AI PCs now to ensure their systems can run AI workloads by the time they actually put these new applications to use. AMD’s Ryzen AI NPU (neural processing unit), backed by a robust software stack, offers benefits such as low latency, high performance, and power efficiency, enhancing corporate efficiency, decision-making, and productivity.

What are the different types of AI workloads?

There are two types of AI workloads: training and inference. Training involves creating models, such as for image recognition or natural language processing, and typically runs best on GPUs. A good example is image recognition: feeding the model pictures of cats and non-cats until it can reliably identify cats. Inference applies trained models to real-world scenarios and can run on CPUs, GPUs, or NPUs; use cases include smart assistants and image processing. As NPU performance continues to improve, more inference workloads are expected to run optimally and efficiently on NPUs.
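To make the training-versus-inference split concrete, here is a minimal, generic sketch in Python (an illustration only, not an AMD-specific workflow; it uses scikit-learn and synthetic data): the training step fits a model once, typically on the most capable hardware available, while the inference step repeatedly applies that fixed model to new inputs, which is the kind of work that can increasingly be offloaded to an NPU.

```python
# Minimal illustration of the training vs. inference split.
# Generic example with synthetic data; not tied to any particular
# accelerator (CPU, GPU, or NPU).
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Training: done once, typically on the most capable hardware ---
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 8))        # synthetic features (stand-in for images)
y_train = (X_train[:, 0] > 0).astype(int)   # synthetic labels ("cat" vs "not cat")
model = LogisticRegression().fit(X_train, y_train)

# --- Inference: applied repeatedly to new, unseen inputs on whatever device is available ---
X_new = rng.normal(size=(5, 8))
predictions = model.predict(X_new)          # array of 0/1 predictions
print(predictions)
```

In practice, a trained model is usually exported to a portable format and executed through a device-specific runtime for deployment, but the conceptual split between the one-off training phase and the recurring inference phase stays the same.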

How will customers benefit from cloud versus local AI processing?

Cloud AI processing offers powerful hardware and scalability for large workloads, ensuring rapid processing. Local AI processing, on the other hand, reduces latency, allowing faster task completion without data transmission delays; it also offers enhanced privacy and security because AI tasks run locally on the PC. The choice depends on customer needs, with both options evolving to combine their strengths in future hybrid services.

How do these advancements strengthen AMD’s position in AI computing?

Advancements in AI computing, especially with AMD’s Ryzen AI processors, reinforce AMD’s market leadership. AMD was the first to introduce an AI accelerator, or NPU, in an x86 mobile processor in early 2023 and was also the first to introduce an NPU for desktop processors at the start of 2024. The combination of a dedicated AI engine, AMD Radeon graphics engine, and powerful Ryzen processor cores enables AI capabilities that cater to a wide range of AI workloads. This positions AMD as a key player in providing AI solutions optimised for both cloud and local processing, offering high performance for AI training and inference tasks.

What are the company’s plans to build up the AI momentum in India?

AMD is promoting AI PCs in India through three market approaches: direct marketing to customers, marketing with channel partners, and joint marketing with OEM partners such as Lenovo, HP, Dell, Acer, ASUS and MSI. Globally, collaborations with ISVs behind software like Topaz Labs’ AI tools, DaVinci Resolve, and Adobe Premiere Pro also enhance user experiences, offering AI-accelerated video editing and photo and video software optimised for AMD processors and Ryzen AI.

AMD also has strategic collaborations with multiple industry leaders to drive the AI momentum from the cloud to enterprise data centres. Oracle plans to offer Oracle Cloud Infrastructure bare metal compute solutions featuring the latest Instinct MI300X accelerators, while Microsoft Azure is the first public cloud provider to deploy AMD Instinct MI200 accelerators for large-scale AI training.


This article was first uploaded on July 1, 2024, at 4:31 pm.