Sundar Pichai, CEO of Google, has been preaching an “AI-first” approach to the company’s mobile go-to-market strategy for a while now. Looking inside the hardware designs of Pixel phones from the Pixel 2 to the current Pixel 4, what we find puzzling is the inclusion of a discrete NPU chip. These neural processing units (NPUs) handle on-device mobile AI applications such as computational photography, speech recognition, and network optimization, all without user input. However, Google’s discrete NPU design is the exception, not the norm, among modern smartphone designs. So why did the Pixel need a separate chip just to do AI when most handset OEMs are satisfied with existing solutions from SoC providers such as Qualcomm or MediaTek? The answer may lie in Google’s vision of putting AI into edge devices like phones, and in how that edge compute will ultimately provide valuable machine learning data to train its AI, creating a virtuous cycle of progress and evolution in AI capabilities.
There has been increasing chatter in the industry that Google will produce its own mobile SoC for devices such as Pixel smartphones, with help from Samsung. While this news is still categorized as a rumor, it does corroborate the design evolution of the NPUs within Pixel phones. While the Pixel 2 and 3 include the same Intel (Movidius) NPU, the Pixel 4 features a completely different Samsung-made NPU with a large 4 Gb of DRAM. Clearly, Google has been evolving and improving its NPU design over time, and if there is any truth to the rumor, we can easily conclude that Google wants to customize the machine learning (ML) capabilities of its mobile devices. What would drive the search giant toward building its own SoC? All signs point to its “AI-first” ambitions for mobility and edge compute.
The inclusion of the NPU chip in the Pixel 2, 3, and 4 does not appear to have raised eyebrows when it was first introduced in 2017. Most AI or ML tasks are performed in the background without user intervention, so most people are oblivious to the additional silicon in the phone. In fact, Google’s differentiated services like on-device translation and transcription run off this chip and do not need a network connection to perform their work. So why invest so heavily in a chip just to perform tasks such as real-time translation? Likely, with the NPU hardware and machine learning on the device, Google is better able to understand common user interactions in order to “teach” its AI and evolve the mobile experience. Basically, this “trojan horse” NPU chipset is another means for Google to gather information from users. Understandably, little has been said about this design and its possible motive, given the privacy issues surrounding it. However, to create better AI, a large set of data is needed to “train” the AI, so there is rhyme and reason to the inclusion of the NPU in previous Pixel devices. That leads us to ask why Google would want to customize an SoC and essentially create its own mobile chip. The privilege of having a bespoke tensor processing design would put Google in a uniquely proprietary position to perform machine learning on these edge devices. The partnership with Samsung makes sense because Samsung created the last NPU chip in the Pixel 4, which by all accounts appears “custom”.
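To make the “edge devices training the AI” idea concrete, one widely used pattern is federated-style learning: each device computes a model update against its own local data, and only those updates are averaged centrally, never the raw user data. The toy linear model, function names, and data below are purely illustrative assumptions, not anything Google has confirmed about how the Pixel NPU is used:

```python
# Hypothetical sketch of federated-style learning. An on-device NPU could
# run local_update(); a server would only see the resulting weights.
# The 1-D least-squares model y = w * x and all data here are illustrative.

def local_update(weight, data, lr=0.1):
    """One gradient step for y = w * x, using only this device's data."""
    grad = sum(2 * x * (weight * x - y) for x, y in data) / len(data)
    return weight - lr * grad

def federated_average(weights):
    """Server side: combine per-device weights without seeing raw data."""
    return sum(weights) / len(weights)

# Each "device" holds private samples of the true relation y = 3x.
devices = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(0.5, 1.5), (1.5, 4.5)],
    [(2.0, 6.0), (3.0, 9.0)],
]

w = 0.0  # shared global model
for _ in range(50):  # training rounds
    w = federated_average([local_update(w, data) for data in devices])

print(round(w, 3))  # converges toward 3.0
```

The point of the sketch is the division of labor: the loop inside `local_update` is the kind of work a phone-side NPU accelerates, while the aggregation step is all a central service would need to improve the shared model.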
But of course, all of this is conjecture on the part of the authors and in danger of treading into conspiracy-theory territory. Still, it presents a plausible explanation for Google’s odd choice to include an NPU chip in its Pixel phones, and for how that strategy matches the overall “AI-first” mission of the larger Google organization: to make it easy for users to find what they are looking for, and to train its sophisticated AI in order to improve the mobile experience.