Cloud TPU and Google I/O 2017

Google I/O 2017: Why we’re excited about Cloud TPU

Google I/O is arguably the biggest and most publicized developer event of the year. On May 17th, Google again made headlines all over the world, but most outlets focused on just the tip of the iceberg. Let’s look at what 2017 has in store for us and why you should sign up for Cloud TPU (if you’re an ML dev, at least).

First off, Google Assistant is on its way to your home appliances (the latest update brings the ability to schedule appointments through voice recognition and support for a large number of devices).

As a natural continuation, Google Lens will use your smartphone’s camera to identify information and redirect you to the proper websites or apps. For example, you will be able to scan the back of your router and automatically sign in to your home wireless network – the camera will recognize and input the password. Going even further, you will be able to scan a poster for a concert or event, and Google will send you directly to the website selling tickets.

Google Lens

These two pieces of news tie into the biggest announcement of them all (in our opinion, at least): Google is deeply invested in Augmented Reality and Virtual Reality, and it is also pouring resources into machine learning. This brings us to Cloud TPU, the second generation of Tensor Processing Units to join the Google Cloud. OK, so CPU stands for Central Processing Unit (a computer’s main processor) and GPU for Graphics Processing Unit (think video cards like NVIDIA’s products)… so what is a TPU?

Google’s Cloud Tensor Processing Unit is a custom-built computer dedicated to machine learning, with astonishing processing power, which Google will make available free of charge to machine learning developers. How much power?

“These revolutionary Cloud TPUs were designed from the ground up to accelerate machine learning workloads. Each Cloud TPU provides up to 180 teraflops of performance, providing the computational power to train and run cutting-edge machine learning models. Cloud TPUs can help you transform your business or create the next research breakthrough.”

Google Cloud TPU

According to the official sign-up page, you can access powerful machine learning accelerators on demand, without building your own datacenter. All of this comes with 64 gigabytes of ultra-high-bandwidth memory, plus the ability to connect to Cloud TPUs from custom VM types in order to optimally balance memory, processor speed and other resources for your individual workloads.

This project is part of a new initiative called Google.ai, which also comprises TensorFlow and other applied AI efforts.
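To give a sense of what “connecting to a Cloud TPU” looks like in practice, here is a minimal sketch in TensorFlow. Note that this uses the TPU APIs that stabilized in later TensorFlow releases (TPUClusterResolver and TPUStrategy), not the alpha interface described at I/O, and the TPU name and model are placeholders for illustration only.

```python
import tensorflow as tf

# "my-tpu" is a hypothetical name; in practice it comes from the TPU
# resource you provision in your Google Cloud project.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Distribute model building and training across the TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# From here, model.fit(...) runs on the TPU just as it would on CPU or GPU.
```

The appeal is that the same Keras code you would run locally can be pointed at a rack of TPUs simply by wrapping it in the distribution strategy, which is exactly the “accelerators on demand” pitch from the sign-up page.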

You can find out more about Cloud TPU here and sign up for the alpha. If you’re a developer who wants to keep up with the trends, it’s worth investigating.

Of course, Cloud TPUs are mostly beneficial to machine learning applications, but Google Cloud offers accelerators for almost any use case. Our own ClusterCS lets you manage your servers from a single location – a central dashboard where you can control servers from Google Cloud to Rackspace or Azure – which is partly why we found this news so interesting and share-worthy.

We’re not going to complain about the fact that most news outlets chose to report on Android reaching 2 billion devices instead of the Cloud TPU announcement. Well, yes, we just complained a bit, but we’re simply too excited about this news.

How about you? What caught your eye at Google I/O 2017? Join the discussion below or on our Facebook/Twitter.