Kevin Krewell, Principal Analyst, Tirias Research
8/3/2018 12:01 AM EDT
Is Google prepared to sell (and support) its new machine learning chips?
At Google Cloud Next, the company stepped up its ASIC development with a machine learning accelerator chip for edge computing based on its TPU design.
Google seemed to imply that it will also sell the Edge TPU directly to companies that want to build intelligent edge devices. Google could be leveraging its internal designs to enter new markets, or attempting to reduce internal ASIC costs by spreading development expenses over chip volumes beyond what the company needs for internal use.
Selling the chips also builds a larger ecosystem for developers. Google initially plans to offer the Edge TPU through a couple of do-it-yourself (DIY) boards.
Google created a series of kits, called AIY (artificial intelligence yourself), for DIY creators who want to add machine learning to their projects. Initially, the projects leveraged Raspberry Pi boards. The most recent kit was a camera module designed to show how to build vision machine learning projects; in that project, Google used Intel's Movidius chip as a dedicated vision inference accelerator.
The two newest developer kits use the Edge TPU chip for inference acceleration. At this time, Google has not released pricing or availability for the kits or the Edge TPU, and technical details are sketchy. We know the Edge TPU supports 8- and 16-bit integer math, but no power or performance numbers have been revealed.
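For context, the 8-bit integer math that inference accelerators like the Edge TPU rely on is standard quantized inference: floating-point values are mapped to int8 through a scale and zero point, so the chip can do cheap integer arithmetic instead of floating-point. The sketch below illustrates that generic affine quantization scheme; it is an assumption for illustration, not Google's actual implementation.

```python
import numpy as np

def quantize(x, scale, zero_point):
    """Map float values to int8 using an affine (scale/zero-point) scheme."""
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize(q, scale, zero_point):
    """Recover approximate float values from the int8 representation."""
    return scale * (q.astype(np.float32) - zero_point)

# Example: weights in [-1, 1] with a scale that maps +/-1.0 to +/-127
w = np.array([-1.0, -0.5, 0.0, 0.5, 1.0], dtype=np.float32)
scale = 1.0 / 127.0
q = quantize(w, scale, zero_point=0)
w_approx = dequantize(q, scale, zero_point=0)
print(q)         # int8 values
print(w_approx)  # floats recovered to within one quantization step
```

The accuracy cost is bounded by the quantization step (here 1/127), which is why 8-bit inference is usually acceptable after a model is trained or calibrated for it.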
Google clearly has ambitions in chip design: John Hennessy recently became chairman of Google parent Alphabet, and David Patterson joined the TPU team. The two men co-authored the definitive textbook on computer architecture.
The TPU and TPU 2 were designed strictly for internal cloud datacenter needs. With the Edge TPU, Google is bringing the TPU architecture to the wider market, and the company can now claim to offer a complete edge-to-cloud ecosystem for machine learning based on Google frameworks. Google's site has a quote from automotive supplier Xee, which endorsed the Edge TPU for its connected-car platform.
There are still a lot of challenges to Google getting into the chip business. First, the company needs to build a distribution channel and a support ecosystem that can deal with hundreds or thousands of customers.
Second is the risk to the embedded systems designer. Google has a history of killing products and services it feels have outlived their usefulness and redirecting resources to more interesting projects. What if Google end-of-lifes the Edge TPU without a migration plan? How would that affect third-party designs that count on the Edge TPU?
Third, selling silicon puts Google in direct competition with existing partners such as Intel/Movidius. New ML chip companies may be reluctant to work with Google for fear that Google will steal their concepts and build a competing chip.
I think it would make more sense for Google to transfer sales responsibility to a third-party chip company, such as NXP or Microchip, that has the sales and logistics infrastructure to support the product for embedded design. Many embedded companies already have channel experience selling to embedded designers and would welcome adding a machine learning accelerator to their product lines. Google seems to want to own the customer relationship to ensure developers follow Google frameworks, but I'm not convinced Google is prepared to support an embedded product over its entire lifespan.
Another alternative for Google is to release the Edge TPU design as open source RTL. The company could put it up on GitHub or donate it to the RISC-V Foundation. This would increase the likelihood that the TPU design would get integrated directly into microcontrollers and SoCs.
The bottom line: I think any embedded designer should be cautious about relying on Google as a silicon supplier.
— Kevin Krewell is a principal analyst at Tirias Research.