
AMD will bake AI into its future chip designs

The chip designer says it is working to build artificial intelligence into all of its products, including CPUs, GPUs, ASICs and FPGAs.

By Ryan Morrison

Chip design giant AMD will bake an artificial intelligence framework into all of its future chip designs. CEO Lisa Su said this was essential as the “full potential of AI can only be realised when it is available across a range of devices.”

AMD says it is baking AI into all of its product designs, including CPUs and GPUs. (Photo by Joseph GTK/Shutterstock)

“AI is truly the most important megatrend for the future of tech,” Su declared in her keynote speech at the CES 2023 trade show, which took place in Las Vegas this week. “At its simplest AI leverages the power of high-performance computing to analyse and interpret massive amounts of data to make predictions on future outcomes.” She added that this applies across a range of business and consumer applications.

“To bring the right level of AI capability to all devices we need multiple compute engines and that means GPUs, CPUs and Adaptive Accelerators,” she explained. “We are one of the only companies in the world to have all of these engines.”

Embedding AI from the start of the design process, as part of AMD’s XDNA architecture, will allow for scalable artificial intelligence that can run from “PCs to intelligent endpoints to edge devices and into the cloud,” Su explained. She went on to say that it would also bring benefits including adaptability and power efficiency.

This has been made possible in part by technology the company gained when it acquired Xilinx last year, including IP and talent focused on AI engines and software. These have allowed for designs that are more adaptable and intelligent, and that can sit inside cloud and edge platforms as well as smart devices.

The first accelerator built on XDNA technology, the Alveo V70, is being previewed at CES. It can be plugged into existing servers to accelerate applications including video analytics and recommendation engines, delivering 400 trillion operations per second of AI computing performance from 75 watts of power.
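As a rough illustration of what those figures imply, dividing the quoted throughput by the quoted power draw gives a back-of-the-envelope efficiency estimate. The sketch below uses only the numbers quoted above; the variable names are ours, not AMD’s.

```python
# Back-of-the-envelope efficiency estimate for the Alveo V70,
# using only the figures quoted in this article.
alveo_v70_tops = 400    # quoted AI throughput, trillion operations per second
alveo_v70_watts = 75    # quoted power draw, watts

tops_per_watt = alveo_v70_tops / alveo_v70_watts
print(f"Approximate efficiency: {tops_per_watt:.1f} TOPS per watt")
# Prints roughly 5.3 TOPS per watt
```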

AI embedded in Ryzen 7040

That same core AI technology is also in the Ryzen 7040 laptop chip, also announced at CES. This is the first mobile x86 processor to come with an integrated on-chip AI engine, according to Su, which AMD has named Ryzen AI. Ryzen AI can run four simultaneous AI streams and deliver up to 12 trillion operations per second of AI performance, and sits alongside the chip’s eight CPU cores, which run at speeds of up to 5.2GHz.

This is an important time for artificial intelligence hardware, with rapidly growing large language models and image generation tools placing ever greater demands on compute, and software from enterprise to consumer applications requiring more analytical power. AMD says users will be able to run massive models on the AI engines inside its Ryzen CPUs that would previously have required a GPU and many times the power.
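AMD did not spell out at CES how developers will target the on-chip engine. A common pattern for accelerators of this kind is to route inference through a runtime that prefers the dedicated engine and falls back to the CPU when it is unavailable; the sketch below illustrates that pattern with ONNX Runtime. The "VitisAIExecutionProvider" name and the model file are assumptions for illustration, not details confirmed by AMD.

```python
# Minimal sketch: prefer a dedicated AI-engine execution provider if one is
# exposed by the runtime, otherwise fall back to the CPU. The provider name
# and model path are assumptions for illustration, not confirmed by AMD.
import numpy as np
import onnxruntime as ort

PREFERRED = "VitisAIExecutionProvider"  # assumed name for the on-chip engine
providers = [PREFERRED] if PREFERRED in ort.get_available_providers() else []
providers.append("CPUExecutionProvider")  # always keep a CPU fallback

session = ort.InferenceSession("model.onnx", providers=providers)  # hypothetical model
input_name = session.get_inputs()[0].name

# Dummy input matching a typical image-model shape (purely illustrative)
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print("Ran inference on:", session.get_providers()[0])
```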

Access to the AI engine opens up a wider range of functions for laptop users, including blurring the background, suppressing background noise and automatically framing video during a Teams call or through Windows Studio Effects, far more efficiently and at lower power than was previously possible.

Windows 11 laptops running on Qualcomm’s Arm-based chips already had access to this functionality, as those chips have embedded AI accelerators that support Windows Studio Effects, but the Ryzen 7040 brings it to x86 laptops.

AMD says it will also bring AI to its GPUs, ASICs and FPGAs to ensure its entire product range has artificial intelligence baked in from the start of the design process.

Qualcomm also made a big announcement at CES, unveiling plans for Snapdragon Satellite, which will allow for two-way satellite messaging on Android devices. This brings the same emergency SOS via satellite capability Apple launched with the iPhone 14 to Android for the first time, and is the result of a partnership with satellite operator Iridium. Device makers using the technology will be able to offer global coverage that works in the most remote and disconnected regions of the world, regardless of weather conditions.

Read more: China has a $143bn semiconductor plan to beat US chip sanctions
