Mobile and IoT chip giant Qualcomm on Wednesday announced an expansion of its artificial intelligence software offering for developers, called the Qualcomm AI Stack portfolio, which the company claims will address the "unique needs of AI for each and every business line" that Qualcomm's chips serve, including mobile phones, connected infrastructure in Internet of Things installations, and connected vehicles.
The software incorporates existing tools, such as the company's Neural Processing SDK, while allowing developers to "leverage that same work across every product."
"We are extending one technology roadmap across all our different business lines," said Qualcomm vice president of product management Ziad Asghar in a media briefing. "The step function here is to be able to take your work and port it across those different business lines."
The software stack focuses on power efficiency, such as how workloads are quantized, and on portability of code across the many hardware targets of Qualcomm and its customers.
On the efficiency side, "the performance per watt that we are showing is just outstanding in terms of how much work we can do for a given amount of power," said Asghar. "Even for the same hardware, as we optimize the software, we are able to get 30% to 40% better performance in some cases."
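Quantization of the kind Asghar alludes to generally means converting a model's 32-bit floating-point weights to 8-bit integers, cutting memory traffic and power draw at a small cost in accuracy. Qualcomm did not detail the stack's internal APIs in the briefing, so the following is only a minimal, generic sketch of post-training weight quantization using plain NumPy, not Qualcomm tooling:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantization of float32 weights to int8.

    Returns the int8 tensor plus the scale factor needed to recover
    approximate float values at inference time.
    """
    # Choose the scale so the largest-magnitude weight maps to 127.
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    # Approximate reconstruction of the original float32 weights.
    return q.astype(np.float32) * scale

# Example: a small random weight matrix shrinks to a quarter of its
# float32 size while losing little numerical fidelity.
w = np.random.randn(256, 256).astype(np.float32)
q, scale = quantize_int8(w)
print("max abs error:", np.abs(dequantize(q, scale) - w).max())
```

Production toolchains add calibration data, per-channel scales, and hardware-specific kernels on top of this basic idea; the sketch only illustrates why lower precision translates into less work per inference.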
Asghar said the stack's main virtue is its ability to handle many different contexts.
"Some of the environments are very focused on their end markets - on auto or cloud," noted Asghar.
"This is the best of class of experience for what you could get out of our hardware, that's the unique advantage," he said.
"This will allow you go get the most optimized and best experience from an efficiency perspective and an accuracy perspective, so, we think this is a huge differentiation for us and our partners."
A goal of the technology is for customers to be able to tune the performance of machine learning against different criteria, said Asghar.
"For this application, latency is very critical, you have to have the ability to optimize for that, whereas for that application, latency is most important, or accuracy - you can make those trade offs with the stack we are offering, and then leverage that work everywhere."