Google Compares Its AI Supercomputer Chip to NVIDIA’s A100: “Up to 1.7x Faster While Using Up to 1.9x Less Power”

Tsing

The FPS Review
Staff member
Someone get Jensen out of bed. Google has published a scientific paper detailing the latest version of its custom chip, the Tensor Processing Unit (TPU), which it strings together to build AI supercomputers, and apparently it's better than the A100, one of NVIDIA's leading chips for AI, data analytics, and HPC. According to an excerpt from the paper, TPU v4 delivers not only 1.2x to 1.7x faster performance than the A100 but also superior efficiency, using 1.3x to 1.9x less power. TPU v4 is a platform of choice for the world's leading AI researchers and features industry-leading efficiency, Google said in a separate article discussing what its engineers have achieved.
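
Quick back-of-the-envelope math on how those two claims combine (a rough Python sketch, not from the paper; it assumes "Nx less power" means power draw divided by N, and the best speedup and best power figure may not come from the same workload):

# Combine Google's reported ranges into an implied perf-per-watt advantage.
# Assumption: "1.3x to 1.9x less power" is read as power divided by that factor.
speedup = (1.2, 1.7)          # TPU v4 throughput vs. A100
power_reduction = (1.3, 1.9)  # A100 power draw / TPU v4 power draw

low = speedup[0] * power_reduction[0]    # worst cases lining up
high = speedup[1] * power_reduction[1]   # best cases lining up

print(f"Implied perf/watt advantage: {low:.2f}x to {high:.2f}x")
# Prints roughly 1.56x to 3.23x; the actual per-workload numbers are in the paper.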

See full article...
 
While this version of AI chews up a lot of compute and RAM, it doesn't feel powerful in execution, and it definitely doesn't feel like it can learn. Or maybe I just have crappy AI engines to tinker with.
 