The Information
Read the latest article from The Information. Subscribe today and save 25% on all of our business, tech and finance reporting.
Apr 19, 2026

Google is in talks with Marvell Technology to develop two new chips aimed at running AI models more efficiently, according to two people with direct knowledge of the discussions. One is a memory processing unit designed to work alongside Google’s tensor processing unit. The other is a new TPU built specifically for running AI models.

The moves underscore surging demand for inference chips, which run the AI models that power commercial products such as autonomous agents. At its GTC conference in March, Nvidia released a chip designed to improve the efficiency of inference workloads. Called a language processing unit, the chip is built on technology Nvidia licensed from startup Groq for $20 billion.

Google in Talks With Marvell to Build New AI Chips for Inference

By Qianer Liu
