Making AI Systems Faster and More Efficient with Google Edge TPUs

Aishwarya Singh 07 May, 2019 • 2 min read

Overview

  • Google has designed an AI chip, called Edge TPU, for enterprise applications
  • It will be able to automate quality control checks in factories
  • Edge TPU is available as a development kit for customers

 

Introduction

In 2016, Google unveiled Tensor Processing Units, or TPUs as they're more commonly known – chips specifically designed for Google's TensorFlow framework. Taking this a step further, the tech behemoth has now introduced the Edge TPU, a small artificial intelligence accelerator that enables machine learning jobs on IoT (Internet of Things) devices.

The Edge TPU is designed to perform the tasks that a machine learning algorithm has been trained for. For example, it will be able to recognize an object in a picture. This part, performing the task the algorithm was trained for, is known as 'inference'. While Edge TPUs are designed to perform inference, Google's server-based TPUs are responsible for training the algorithm.
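The training/inference split described above can be illustrated with a minimal, framework-free sketch. This is plain Python, not actual Edge TPU or TensorFlow code: "training" fits a single weight, and "inference" merely applies that learned weight to new input – the lightweight step that an on-device accelerator like the Edge TPU is built to run.

```python
# Conceptual sketch of training vs. inference (plain Python; illustrative only,
# not Edge TPU code).

def train(xs, ys):
    """'Training': find the weight w minimizing squared error for y ~ w * x.

    Uses the closed-form least-squares solution for a single weight:
    w = sum(x * y) / sum(x * x).
    """
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def infer(w, x):
    """'Inference': apply the already-learned weight to a new input."""
    return w * x

# Training happens once, on powerful server-side hardware (Google's TPUs).
w = train([1, 2, 3, 4], [2, 4, 6, 8])  # learns w = 2.0

# Inference happens repeatedly, on-device (the Edge TPU's role).
print(infer(w, 5))  # -> 10.0
```

The point of the split is that the expensive optimization runs once in the data center, while the cheap forward pass runs many times at the edge.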

As the team mentioned in their blog post, the newly designed chips are meant for various enterprise jobs, such as automating quality checks in factories. If you were hoping to see it in your smart devices, sorry to disappoint! Currently, the hardware used for such tasks sends data over the internet for analysis. Replacing it with Edge TPU-equipped devices eliminates that round trip, which means less downtime and faster results.

This is not the first attempt at creating AI chips for on-device tasks. Other companies like ARM, Qualcomm and MediaTek have their own AI accelerators, and of course Nvidia's GPUs are among the best in the business. So how is Google different from any of these?

Here is the interesting part – with Google, one can store data on Google Cloud, train algorithms using TPUs, and then carry out on-device inference using the new Edge TPUs. Google can ensure that all these processes run as efficiently and smoothly as possible, making it a seamless experience for the end user.

Google is also making the Edge TPU available as a development kit for customers. The idea is to let customers test the hardware's capabilities and see how it fits into their existing product catalog.

 

Our take on this

Google continues to stamp its authority in the IoT space. On-device machine learning is expected to be more secure and to provide faster results. Also, for the end user, storing data, training algorithms and performing the required task(s) will all become simpler, as they will not have to switch between different platforms. Google's Cloud offerings, TPU and Edge TPU will cover all of this!

 

Subscribe to AVBytes here to get regular data science, machine learning and AI updates in your inbox!

 

