Always-on endpoint AI: Balanced system design

Traditional AI hardware design is a matter of careful compromises: compute, memory, and bandwidth must be balanced so that no single resource becomes the bottleneck. This is complicated by the fact that there is no such thing as an average 'AI workload.' In reality, neural networks vary widely in how they tax each of these resources, forcing system designers either to pick a 'sweet spot' compromise or to build a niche product.
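This diversity can be made concrete with arithmetic intensity, the number of operations performed per byte of memory traffic. The sketch below compares two hypothetical layer shapes (all dimensions and the int8 assumption are illustrative, not taken from the article): a convolution tends to be compute-bound, while a batch-1 fully-connected layer is heavily bandwidth-bound.

```python
# Illustrative only: arithmetic intensity (MACs per byte moved) for two
# hypothetical layer shapes, showing why one workload can be compute-bound
# while another is bandwidth-bound on the same hardware.

def arithmetic_intensity(macs, bytes_moved):
    """Multiply-accumulates performed per byte of memory traffic."""
    return macs / bytes_moved

# Hypothetical 3x3 conv: 64 in / 64 out channels, 56x56 feature map, int8.
conv_macs = 3 * 3 * 64 * 64 * 56 * 56
conv_bytes = (56 * 56 * 64        # input activations (1 byte each)
              + 3 * 3 * 64 * 64   # weights
              + 56 * 56 * 64)     # output activations
print(f"conv: {arithmetic_intensity(conv_macs, conv_bytes):.1f} MACs/byte")

# Hypothetical fully-connected layer: 1024 -> 1000, batch size 1, int8.
fc_macs = 1024 * 1000
fc_bytes = 1024 + 1024 * 1000 + 1000  # input + weights + output
print(f"fc:   {arithmetic_intensity(fc_macs, fc_bytes):.3f} MACs/byte")
```

The conv layer reuses each weight and activation many times, while the fully-connected layer touches every weight exactly once, so the same accelerator can be starved for bandwidth on one network and starved for compute on another.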

Endpoint AI introduces power as a further constraint. Of these resources, memory bandwidth has the largest impact on power consumption, followed by compute.
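A back-of-the-envelope split shows why memory traffic dominates the power budget. The per-operation energy figures below are assumptions chosen only to reflect the common rule of thumb that an off-chip memory access costs orders of magnitude more energy than an arithmetic operation; they are not measurements from the article.

```python
# Rough energy split for one inference. The picojoule figures are
# assumed, illustrative values, not measured numbers.
MAC_PJ = 1.0              # assumed energy per int8 multiply-accumulate
DRAM_PJ_PER_BYTE = 100.0  # assumed energy per byte of off-chip traffic

def inference_energy_uj(macs, dram_bytes):
    """Rough total energy estimate (microjoules) for one inference."""
    return (macs * MAC_PJ + dram_bytes * DRAM_PJ_PER_BYTE) / 1e6

# Hypothetical small network: 50M MACs, 4 MB of DRAM traffic.
macs, dram_bytes = 50e6, 4e6
compute_uj = macs * MAC_PJ / 1e6
memory_uj = dram_bytes * DRAM_PJ_PER_BYTE / 1e6
print(f"compute: {compute_uj:.0f} uJ, memory: {memory_uj:.0f} uJ")
```

Even though the network performs roughly ten times more MACs than it moves bytes, the memory term dominates the total, which is why endpoint designs lean so heavily on on-chip buffering and traffic reduction.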
