
The First Place You'll See Embedded AI is Smartphones


Without a doubt, the first major implementation of embedded AI at the application level will be on your smartphone. Smartphones, of course, have a persistent network connection, so app queries can always be sent back to the cloud for processing in an AI application. However, with the projected growth in AI usage, it does not make sense to keep adding traffic to networks and running huge numbers of inference tasks in the data center.
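To see why, consider a rough back-of-envelope estimate of the upstream traffic generated by cloud-only inference. The figures below (device count, query rate, payload size) are assumptions chosen purely for illustration, not measured data:

```python
# Back-of-envelope estimate of upstream traffic if every AI query is sent to
# the cloud. All inputs are hypothetical, illustration-only assumptions.
handsets = 100_000_000        # assumed number of active AI-capable handsets
queries_per_day = 50          # assumed AI queries per handset per day
payload_kb = 200              # assumed average upload per query (e.g., an image crop)

total_kb_per_day = handsets * queries_per_day * payload_kb
total_tb_per_day = total_kb_per_day / 1e9   # 1 TB = 1e9 KB (decimal units)

print(f"Upstream traffic: ~{total_tb_per_day:,.0f} TB per day")
# With these assumptions: ~1,000 TB/day of traffic that on-device inference avoids.
```

Even with conservative numbers, moving these queries onto the handset removes that traffic, and the round-trip latency, entirely.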

This is one reason companies are looking to bring more of that AI processing onto the end device, and chipmakers are responding with new products. As an example, the new Snapdragon processor claims to make smartphones even smarter, with AI capabilities available directly on the user's handset. These new processors are a game changer for end-user adoption of AI applications, putting AI capabilities within reach in everyday life. So what do these chips really look like, and what capabilities do PCBs need to implement them?

Embedded AI Takes the Right Chipsets

Obviously, without the right processor, you will never be able to implement AI capabilities in a timely, low-power manner. GPUs get most of the attention because their internal architecture supports the highly parallelized computation used in tensor mathematics. These operations are the gears that make neural networks tick, and tensor processing units (TPUs) implement essentially the same kind of architecture for low-latency, low-power inference.
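At its core, the workload these accelerators target is a long chain of multiply-accumulate operations. The minimal sketch below (layer sizes are arbitrary and chosen only for illustration) shows the matrix math behind a single fully connected layer; a GPU or TPU speeds up inference by executing these multiply-accumulates in parallel, often in reduced precision:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer dimensions, chosen only for illustration.
inputs = rng.standard_normal(256).astype(np.float32)            # activation vector
weights = rng.standard_normal((128, 256)).astype(np.float32)    # layer weight matrix
bias = rng.standard_normal(128).astype(np.float32)

# One fully connected layer = 128 x 256 multiply-accumulates plus a nonlinearity.
# An accelerator performs these MACs in parallel (e.g., in a systolic array),
# which is what makes on-device inference fast and power-efficient.
outputs = np.maximum(weights @ inputs + bias, 0.0)  # ReLU activation

print(outputs.shape)  # (128,)
```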

To build these devices, more companies have been taking their chip design tasks in-house and contracting fabrication out to an external foundry. Intel's announcement of a dedicated foundry services division should be no surprise, as it allocates more foundry capacity to support these kinds of products. Companies that want access to AI capabilities without the form-factor penalty of a separate accelerator chip need to put that processing block into the main chip package.

Heterogeneous integration is the approach that allows companies to build these types of processors for their products. These processors appear in several types of advanced packaging:

  • Package-on-package (PoP)

  • Fan-out wafer-level packaging (FOWLP)

  • 3D stacked processors

  • Integrated fan-out (InFO)


Advanced packaging styles.

Why will smartphones be the first group of products to include embedded AI capabilities? Aside from the network traffic and bandwidth reasons mentioned above, there are a host of apps that can use embedded AI to deliver higher performance, especially when inference runs entirely on the user's handset. Performing low-compute inference tasks on the handset is a natural way to enable a range of new AI-capable apps without increasing network traffic.

There's also the economics of implementing embedded AI at scale. Handsets are produced in huge volumes, more than enough to recoup the design and prototyping costs of new chipsets with AI processing blocks, which makes them the lowest-risk path to scaling an AI-capable chipset. Even if phones with these chipsets flop with consumers and new application usage grows slowly, the production scale more than covers the development cost, and the IP can then be ported to a new product that targets a different use case.

Which Packaging Style Wins?

In the age of heterogeneous integration, the industry has tended to use different types of packages for different applications. For a significant amount of time, package-on-package has been the main approach to mobile processor design, and it is likely to remain so going forward, depending on what IP is included in the main CPU die.


There is very little room inside a smartphone for an AI accelerator chip. The optimal place to put an AI compute block is on the main processor die.

The packaging style you're likely to see in these newer products depends on whether the AI block needs to live on its own die or can be integrated onto another die as IP. As the image above shows, there is very little room for an additional component.

No matter which approach to packaging wins out, companies designing these components for their products will have to work closely with a foundry and an OSAT to determine the right packaging style and component size. Chip packages in smartphones need the highest possible reliability over broad temperature ranges and must withstand occasional mechanical shock. The chip form factor then drives what happens on the PCB, which in smartphones has largely moved to a stacked, substrate-like PCB approach with embedded components.

As more companies take control of their chip design operations alongside advanced PCB design, multidisciplinary design teams will need the best set of tools for interchange between PCB layout and chip design. No matter what you need to design, you can build it with the best set of PCB design features with MCAD support in Allegro PCB Designer from Cadence. Only Cadence offers a comprehensive set of circuit, IC, and PCB design tools for any application and any level of complexity.

Subscribe to our newsletter for the latest updates. If you’re looking to learn more about how Cadence has the solution for you, talk to our team of experts.