The GPU (Graphics Processing Unit) was originally designed for graphics rendering, but it has become the undisputed default compute device for deep learning. The reason is simple: a GPU runs highly parallel operations, such as the matrix multiplications inside neural networks, across thousands of cores at once, where a CPU would handle the same workload far more slowly and expensively. NVIDIA's CUDA ecosystem became the de facto standard from around 2007 onwards, and today the A100, H100 and H200 form the backbone of the LLM stack. AMD (MI300) and specialised accelerators (TPU, NPU, Cerebras, Groq) offer alternatives, but ecosystem gravity still sits firmly on the NVIDIA side.
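Why does matrix multiplication map so well onto thousands of cores? Because every cell of the output matrix is an independent dot product, so they can all be computed at the same time. A minimal pure-Python sketch of that structure (illustrative only, not how a GPU actually executes it):

```python
def matmul(A, B):
    """Multiply matrices A (n x k) and B (k x m).

    Each output cell C[i][j] is an independent dot product of
    row i of A with column j of B. This independence is exactly
    what lets a GPU compute thousands of cells in parallel,
    while this Python version computes them one by one.
    """
    n, k, m = len(A), len(B), len(B[0])
    return [
        [sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
        for i in range(n)
    ]


# A 2x2 example: four output cells, each computable independently.
C = matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])
print(C)  # [[19, 22], [43, 50]]
```

On a GPU, a framework such as PyTorch or JAX dispatches each of those independent cells (in practice, tiles of them) to separate hardware threads, which is where the speedup over a CPU comes from.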
MEVZU N°124 · ISTANBUL · YEAR I — VOL. III
Glossary · Beginner · 2007
GPU
A processor that runs massive parallel computations — the workhorse of deep learning.
- EN — GPU (Graphics Processing Unit)
- TR — GPU (Grafik İşlem Birimi)