The TPU v4 supercomputer, which is claimed to outperform TPU v3 by 2.1 times, comes with SparseCores, dataflow processors that accelerate models relying on embeddings by five to seven times while using only 5% of die area and power


Google reveals new information about its TPU v4 supercomputer. (Credit: Outreach Pete/Wikimedia Commons)

Google has revealed new information about the TPU v4 supercomputer it has used to train artificial intelligence (AI) models, claiming that it is faster and more power efficient than comparable systems from Nvidia.

The Alphabet company built its own custom application-specific integrated circuits (ASICs), called Tensor Processing Units (TPUs), to speed up machine learning workloads.

TPU v4 is Google’s fifth domain-specific architecture (DSA) and its third supercomputer for machine learning (ML) models.

Each TPU v4 supercomputer comes with SparseCores, dataflow processors that accelerate models relying on embeddings by five to seven times while using only 5% of die area and power.

TPU v4 is said to outperform TPU v3 by 2.1x while improving performance per watt by 2.7 times.

Google said that the TPU v4 supercomputer is four times bigger, at 4,096 chips, making it overall about 10 times faster.
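As a rough back-of-envelope check of how those figures combine (our illustration, not Google's published methodology), multiplying the reported ~2.1x per-chip gain over TPU v3 by the 4x larger chip count lands near the "about 10 times" claim:

```python
# Hedged sketch: combining the reported per-chip speedup with system scale.
# The 1,024-chip TPU v3 baseline and near-linear scaling are assumptions.
per_chip_speedup = 2.1        # TPU v4 vs TPU v3, as reported
scale_factor = 4096 / 1024    # 4x more chips than the assumed TPU v3 pod

rough_system_speedup = per_chip_speedup * scale_factor
print(rough_system_speedup)   # 8.4
```

Under near-linear scaling this yields roughly 8.4x; interconnect and software improvements would plausibly account for the remaining gap up to the ~10x figure.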

In addition, TPU v4’s optical circuit switches (OCS) provide the flexibility to support large language models.

The tech major claimed that, for systems of similar size, the supercomputer is 4.3x to 4.5x faster than the Graphcore IPU Bow, and 1.2x to 1.7x faster than the Nvidia A100 while consuming 1.3 to 1.9 times less power.

Google fellow Norm Jouppi and Google distinguished engineer David Patterson wrote in a blog post about the system: “Circuit switching makes it easy to route around failed components.

“This flexibility even allows us to change the topology of the supercomputer interconnect to accelerate the performance of an ML (machine learning) model.”

The supercomputer has been deployed in a data centre in Mayes County, Oklahoma since 2020, reported Reuters.

TPU v4s in Google Cloud’s warehouse-scale computers use nearly three times less energy and produce about 20 times less carbon dioxide than contemporary DSAs in a typical on-premises data centre, said Google.