If you’ve followed Google’s announcements at I/O 2016, one stand-out from the keynote was the mention of a Tensor Processing Unit, or TPU (not to be confused with thermoplastic urethane). I was hoping to learn more about this TPU; however, Google is currently holding any architectural details close to its chest.

More will come later this year, but for now what we know is that this is an actual processor with an ISA of some kind. What exactly that ISA entails isn't something Google is disclosing at this time - and I'm curious as to whether it's even Turing complete - though in their blog post on the TPU, Google did mention that it uses "reduced computational precision." It’s a fair bet that, unlike GPUs, there is no ISA-level support for 64-bit data types, and given the workload it’s likely that we’re looking at 16-bit floats or fixed-point values, or possibly even 8-bit.
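
To put a number on what "reduced computational precision" might mean in practice, here is a minimal sketch of the linear 8-bit quantization commonly used for neural-network inference. To be clear, this is a generic illustration in NumPy, not Google's actual scheme - the function names, the symmetric scaling, and the choice of int8 are all assumptions on my part.

```python
import numpy as np

def quantize_int8(weights):
    """Linearly map float32 values onto signed 8-bit integers.

    Returns the int8 values plus the scale factor needed to recover
    approximate float values later (dequantization).
    """
    scale = np.max(np.abs(weights)) / 127.0  # map the largest magnitude to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 values from the int8 representation."""
    return q.astype(np.float32) * scale

# A small weight vector survives the round trip with only minor error
w = np.array([0.42, -1.30, 0.07, 0.95], dtype=np.float32)
q, s = quantize_int8(w)
print(q)                  # e.g. [ 41 -127    7   93]
print(dequantize(q, s))   # values close to the originals
```

The hardware appeal is straightforward: an 8-bit multiplier is far smaller and burns far less power than a 32- or 64-bit floating point unit, and inference workloads tolerate the small rounding error.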

Reaching even further, it’s possible that instructions are statically scheduled in the TPU, although this inference is based on a rather general comment about how static scheduling is more power efficient than dynamic scheduling, which is not exactly a revelation. I wouldn’t be entirely surprised if the TPU actually looks an awful lot like a VLIW DSP with support for massive levels of SIMD and some twist to make it easier to program for, especially given recent research papers and industry discussions regarding the power efficiency and potential of DSPs in machine learning applications. Of course, this is just idle speculation, so it’s entirely possible that I’m completely off the mark here, but it will definitely be interesting to see exactly what architecture Google has decided is best suited to machine learning applications.
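
If the VLIW-DSP guess is anywhere near the mark, the workhorse operation would be a low-precision multiply-accumulate repeated across many SIMD lanes. The sketch below shows that kernel in scalar NumPy form - 8-bit inputs, a wide 32-bit accumulator, and a scale to get back to floating point. Again, this is a hypothetical illustration of the technique, not anything Google has confirmed about the TPU.

```python
import numpy as np

def int8_dot(a_q, b_q, a_scale, b_scale):
    """Dot product of two int8 vectors using a wide (int32) accumulator.

    This is the kind of multiply-accumulate kernel a VLIW/SIMD design
    would issue across many lanes per cycle; the scales map the integer
    result back to an approximate floating-point value.
    """
    acc = int(np.sum(a_q.astype(np.int32) * b_q.astype(np.int32)))  # 8b x 8b products, 32b accumulation
    return acc * a_scale * b_scale

# Toy example: two already-quantized vectors and their (hypothetical) scales
a_q = np.array([100, -50, 25], dtype=np.int8)
b_q = np.array([-20, 60, 127], dtype=np.int8)
print(int8_dot(a_q, b_q, a_scale=0.01, b_scale=0.02))  # approximate float dot product
```

A chip built around thousands of such lanes, fed by a compiler doing the scheduling up front, would fit both the "reduced precision" remark and the power-efficiency angle.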

Source: Google

Comments

  • ddriver - Friday, May 20, 2016 - link

It is not apparent that cooling will be passive. What's apparent is that this was designed for 1U rackmount server chassis, which do not leave enough height to mount fans on top of the radiators; instead, air is blown through the entire chassis.

The card's radiator is designed to be cooled by that airflow, and judging from the power circuit components and the radiator itself, it draws around 150 watts.
  • sciwizam - Friday, May 20, 2016 - link

According to Urs Hölzle, Google's head of data centers,

    "..he said that Google would be releasing a paper describing the benefits of its chip and that Google will continue to design new chips that handle machine learning in other ways. Eventually, it seems, this will push GPUs out of the equation. “They’re already going away a little,” Hölzle says. “The GPU is too general for machine learning. It wasn’t actually built for that.”

    http://www.wired.com/2016/05/googles-making-chips-...
  • Qwertilot - Friday, May 20, 2016 - link

    It'll be fun to see where it all goes :)

    That big Pascal thing does go a fair way towards this of course, and this is actually a potentially big enough market that you could imagine them ultimately doing more or less dedicated chips for it. Probably other people too.
  • surt - Friday, May 20, 2016 - link

    Seems pretty clear that Google now has a superintelligent AI in operation. Humorous watching all the puppets dance for it, but also scary of course. Will be interesting to see whether it maintains a beneficent stance once it has sufficient robotic independence in the physical world.
  • ddriver - Friday, May 20, 2016 - link

Like everything else that has come out of technology, it will be employed in turning humans into more efficient milking cattle. Google has collected all sorts of data for decades; now they want the hardware to efficiently make something out of it. They intend to make so much money on it that their advertising business will look like a joke next to it.
  • ddriver - Friday, May 20, 2016 - link

    "Seems pretty clear that Google now has a superintelligent AI in operation"

I highly doubt they have that. What they have is exabytes of people's personal, private, and public data and the intent to comb through it for anything anyone is willing to pay for. The big winners will be Google, governments, banks, and corporations, and their gains will come from the only possible source - the general population.
  • Murloc - Friday, May 20, 2016 - link

...which is enjoying internet services, and the economic and personal benefits of them, like no generation before - and this is the price for it all.
    And they willingly submit to this.
    They can use a dumbphone and DuckDuckGo and run their own e-mail server or whatever if they have a problem.
  • ddriver - Saturday, May 21, 2016 - link

...which is saving you the effort of basic thought, enabling and easing the complete loss of that ability. Judging by your comment you are already there; at this point you need them to tell you what to do and what to believe :)
  • ddriver - Saturday, May 21, 2016 - link

You will really be enjoying it when some mediocre, $5 worth of AI makes your job obsolete and renders you entirely useless. You will have such a blast enjoying internet services and the economic and personal benefits while you dig through the trash to survive. Who knows, maybe they will launch a "dumpster digger" service to inform the likes of you of the location and contents of various dumpsters.
  • Murloc - Friday, May 20, 2016 - link

    it's as easy as pulling a lever to turn it off.
