TPU: Google’s Way to Speed Up Machine Learning
Google announced its new product, the Tensor Processing Unit, better known as the TPU, at its I/O developer conference last month. The TPU is named after the TensorFlow software it uses for its machine learning programs.
Google CEO Sundar Pichai said it provides an order of magnitude better performance per watt than existing chips for machine learning tasks. The TPU could potentially speed up machine learning processes while consuming considerably less power.
Google has confirmed that the chip has been under development for two years. Pichai also revealed that TPU chips were used in the AlphaGo computer that beat Lee Sedol, the world Go champion, in a widely publicized match in March. This shows Google’s commitment to the statement that “great software shines brightest with great hardware underneath.”
Bruce Daley, principal analyst for the Boulder, Colo.-based research firm Tractica, said the TPU can help fill the widening gap in Moore’s law — which has long dictated that the number of transistors in a dense integrated circuit doubles about every two years. Daley also said, “The fact that this product uses TensorFlow tells us that it has applications in machine learning and deep learning.”
Machine learning is used in a variety of applications, such as data analytics, translation software, and voice recognition. Google has said the TPU provides gains equivalent to moving Moore’s law forward by three generations, or roughly seven years.
Because the TPU is tailored for machine learning applications, the chip is more tolerant of reduced computational precision, which means it requires fewer transistors per operation. This allows more operations per second to be squeezed into the silicon, and it lets more sophisticated and powerful machine learning models be applied more quickly, yielding more intelligent results at higher speed. A board with a TPU fits into a hard disk drive slot in Google’s data center racks.
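To make the reduced-precision idea concrete, here is a minimal sketch of 8-bit linear quantization, a common textbook technique for trading numeric precision for smaller, cheaper arithmetic. This is an illustration of the general principle only, not Google’s actual hardware design; the function names and the scale-based scheme are assumptions for the example.

```python
def quantize(values, num_bits=8):
    """Map floats to signed num_bits-wide integer codes plus one scale factor."""
    qmax = 2 ** (num_bits - 1) - 1           # e.g. 127 for 8 bits
    max_abs = max(abs(v) for v in values) or 1.0
    scale = max_abs / qmax                   # one float recovers the full range
    return [round(v / scale) for v in values], scale

def dequantize(q_values, scale):
    """Approximately reconstruct the original floats from the integer codes."""
    return [q * scale for q in q_values]

# Hypothetical model weights, stored as 8-bit codes instead of 32-bit floats:
weights = [0.91, -0.42, 0.07, -0.88]
codes, scale = quantize(weights)
approx = dequantize(codes, scale)

# Each code needs 4x fewer bits per weight; the cost is a small rounding
# error, bounded by half a quantization step (scale / 2).
max_error = max(abs(a - w) for a, w in zip(approx, weights))
```

Hardware operating on such narrow integers needs far fewer transistors per multiply-accumulate than 32-bit floating-point units do, which is the trade-off the TPU exploits.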
Already Being Used
Machine learning provides the oomph factor to many of Google’s most loved applications. More than 100 Google development teams are using machine learning in their work on Inbox Smart Reply, Street View, the RankBrain search result sorting system, and many other applications. The TPU is now being used across Google’s cloud. This advancement has come at a time when more and more applications are being developed in the cloud, resulting in fewer concerns about hardware configuration and maintenance.
Urs Hölzle, Google’s senior vice president for technical infrastructure, said at the I/O conference that the TPU can augment machine learning processes, but that enough functions still require CPUs and GPUs that the new product is unlikely to replace them.
The TPU serves as an example of how fast research has been turned into practice: from the first tested silicon chip, Google’s team had the TPU up and running applications at speed within 22 days. Given the pace at which hardware has been upgraded in recent years, the TPU itself could be replaced in the coming years. Machine learning has transformed how developers build intelligent applications, and we are excited to see how the TPU brings these possibilities to life.
Original article published at: http://techttalks.com/tpu-googles-way-speed-machine-learning/