TPU (Tensor Processing Unit) is a family of ASICs that Google announced in 2016, custom-built to run deep-learning matrix workloads at very high efficiency. Its systolic-array architecture executes matrix multiplications with far higher throughput per watt than a comparable GPU, and it powers internal Google workloads — Search, Translate, YouTube and Gemini — at huge scale. Generations such as TPU v4, v5e, v5p and Trillium are also exposed externally through Google Cloud, with particularly tight integration into the JAX ecosystem. TPUs are the most credible non-NVIDIA option at the frontier, though their ecosystem and developer community remain smaller than CUDA's.
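The JAX integration mentioned above can be illustrated with a minimal sketch: a JIT-compiled matrix multiply, the kind of operation a TPU's systolic array accelerates. On TPU hardware such matmuls typically run in bfloat16; the same code runs unchanged on CPU or GPU, with XLA picking the backend.

```python
import jax
import jax.numpy as jnp

@jax.jit
def matmul(a, b):
    # On a TPU backend, XLA lowers this dot to the matrix unit;
    # elsewhere it falls back to the local CPU/GPU implementation.
    return jnp.dot(a, b)

# bfloat16 is the native matmul precision on TPUs.
a = jnp.ones((128, 128), dtype=jnp.bfloat16)
b = jnp.ones((128, 128), dtype=jnp.bfloat16)
c = matmul(a, b)
print(c.shape)  # (128, 128); every entry is 128.0
```

The point of the sketch is that nothing in the code is TPU-specific: the hardware choice is a deployment detail, which is why JAX workloads port to Cloud TPU with little friction.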
MEVZU N°124 · ISTANBUL · YEAR I — VOL. III
Glossary · Intermediate · 2016
TPU (Tensor Processing Unit)
Google's custom ASIC accelerator family designed specifically for deep-learning workloads.
- EN — TPU (Tensor Processing Unit)
- TR — TPU (Tensor İşlem Birimi)