Chip giant Intel says its next-generation processor will be able to run generative AI chatbots locally, rather than having to send data out to the cloud. The company says this is more secure and will let users work with AI without an internet connection. However, it could find itself trailing Qualcomm, which has already demonstrated AI models running locally on its Arm-based chips in a Windows laptop.
Since the launch of OpenAI’s ChatGPT in November last year, companies have raced to build the technology into their applications. Microsoft is adding its own OpenAI-powered chatbot, Copilot, to Windows 11, although it currently runs in the cloud. Apple is using foundation AI models that run on the iPhone for transcription and voice modelling.
AI is big business: the market is expected to be worth $667bn by 2030, growing at a compound annual rate of 47.5%. Investment in AI has made Nvidia one of the most valuable companies in the world, and every Big Tech company is working on its own AI models.
Intel says its upcoming Core Ultra laptop chip can run AI tools such as Microsoft’s Copilot, Stable Diffusion and Meta’s Llama 2 model locally. During the announcement at its Innovation conference, the company showed laptops generating a song in the style of Taylor Swift and answering natural-language questions, all running locally.
Core Ultra, codenamed Meteor Lake, is the first Intel chip to include an integrated neural processing unit (NPU), which the company says lets it run AI models more power-efficiently, without needing a large server to do the work.
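For illustration only (this is not Intel’s software stack): running a chatbot locally typically means loading quantised model weights from disk and generating text entirely on the device, with no network calls. A minimal sketch using the open-source llama-cpp-python bindings, assuming a Llama 2 model file has already been downloaded to a hypothetical local path, might look like this:

```python
# Illustrative sketch of fully local LLM inference; not Intel's tooling.
# Assumes llama-cpp-python is installed and a quantised Llama 2 file exists
# at ./llama-2-7b-chat.Q4_K_M.gguf (hypothetical path).
from llama_cpp import Llama

# Load the quantised model from local disk; no cloud service is involved.
llm = Llama(model_path="./llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

# Generate a reply entirely on-device.
output = llm(
    "Q: Why does an on-chip NPU help run AI models on a laptop? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(output["choices"][0]["text"].strip())
```

The point of hardware like an NPU is to run this kind of workload within a laptop’s power budget rather than on data-centre GPUs.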
Integrated NPUs aren’t new. Apple has included them in its in-house M-series chips for Mac and iPad, and AMD offers something similar in its Ryzen mobile processors. Earlier this year, Qualcomm demonstrated Snapdragon chips running Windows and AI models on a laptop.
“We see the AI PC as a sea change moment in tech innovation,” said Intel CEO Pat Gelsinger during his keynote at the conference. Meteor Lake, due for release in December, was described by Intel as its biggest architectural shift in generations and the foundation for the next wave of PC innovation, ushering in what the company calls the “AI PC era”, a paradigm shift in computing.
Most AI models for generating text, images, music and code are trained on massive datasets using GPUs in data centres. Intel has struggled to gain ground against Nvidia in that market, but hopes to catch up with CPUs capable of running the models once they have been trained. Intel says it is also building a supercomputer for Stability AI, the maker of Stable Diffusion, and hardware for Alibaba Group to run its own chatbot models.
Intel says the success of its AI PC technology will depend on killer applications and on demand for a more data-secure way to use AI. During the keynote, Gelsinger demonstrated apps that use AI, including sports applications. He also showed off the next-generation Lunar Lake processors, which are due at the end of next year and will be capable of advanced image generation using models such as Stable Diffusion.