Inference Chips Market Growth Potential
Analysts predict rapid growth in the market for data center inference chips as businesses integrate AI into their products. Companies such as Alphabet Inc.'s Google are looking for ways to contain the additional costs of deploying AI, with electricity a significant expense.

Qualcomm's Cloud AI 100 Targets Power Conservation
Qualcomm leveraged its experience designing chips for battery-powered devices to develop the Cloud AI 100, a chip aimed at minimizing power consumption. In image classification tests, the Cloud AI 100 outperformed Nvidia's H100, achieving 227.4 server queries per watt to Nvidia's 108.4.

Object Detection Comparison
Qualcomm also surpassed Nvidia in object detection, a technology useful for applications like analyzing retail store footage. Qualcomm's chips scored 3.8 queries per watt, while Nvidia's chips reached 2.4.

Nvidia Leads in Natural Language Processing
In natural language processing tests, Nvidia claimed the top spot in both absolute performance and power efficiency, achieving 10.8 queries per watt to Qualcomm's second-place 8.9. Natural language processing is widely used in systems such as chatbots.
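To make the power-efficiency gaps concrete, the figures reported above can be tabulated and compared. This is an illustrative sketch using only the numbers in this article; the test labels are informal shorthand, not official benchmark names.

```python
# Power-efficiency results reported above, in server queries per watt.
# Test labels are informal shorthand for the benchmark categories discussed.
results = {
    "image classification": {"Qualcomm Cloud AI 100": 227.4, "Nvidia H100": 108.4},
    "object detection": {"Qualcomm Cloud AI 100": 3.8, "Nvidia H100": 2.4},
    "natural language processing": {"Qualcomm Cloud AI 100": 8.9, "Nvidia H100": 10.8},
}

for test, scores in results.items():
    qcom = scores["Qualcomm Cloud AI 100"]
    nv = scores["Nvidia H100"]
    leader = "Qualcomm" if qcom > nv else "Nvidia"
    # Ratio of the leader's efficiency to the runner-up's.
    ratio = max(qcom, nv) / min(qcom, nv)
    print(f"{test}: {leader} leads at {ratio:.2f}x the queries per watt")
```

Running this shows Qualcomm's lead is largest in image classification (roughly 2.1x), while Nvidia's NLP advantage is the narrowest margin of the three (about 1.2x).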