FLOPs (floating-point operations per second) measures how many floating-point operations a piece of hardware can perform each second, a classic compute-power metric inherited from high-performance computing. (Strictly, the rate is written FLOPS and "FLOPs" is the plural of FLOP, a count of operations, but in practice the forms are used interchangeably.) In AI it is usually quoted in TFLOPS (tera, 10^12) or PFLOPS (peta, 10^15); an NVIDIA H100 reaches roughly 1 PFLOPS of dense Tensor Core throughput at FP16 and around 2 PFLOPS at FP8. But theoretical peak and real-world performance rarely line up: memory bandwidth, kernel occupancy, and the software stack often become the actual bottleneck. That is why modern AI infrastructure discussions weight MFU (Model FLOPs Utilization), the fraction of the paper number you can actually achieve, at least as heavily as the headline FLOPs figure.
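The MFU calculation can be sketched in a few lines. This is a minimal illustration, not a benchmarking tool: it uses the common rule of thumb that one training step of a dense transformer costs roughly 6 FLOPs per parameter per token (forward plus backward, ignoring attention), and all the numbers below are hypothetical.

```python
def model_flops_utilization(n_params, tokens_per_second, peak_flops):
    """Estimate MFU: achieved model FLOPs as a fraction of hardware peak."""
    # ~6 FLOPs per parameter per token is the standard estimate for
    # forward + backward passes of a dense transformer.
    achieved_flops = 6 * n_params * tokens_per_second
    return achieved_flops / peak_flops

# Hypothetical example: a 7B-parameter model training at 12,000 tokens/s
# on one H100, using ~1e15 FLOPS (1 PFLOPS) as the FP16 peak.
mfu = model_flops_utilization(n_params=7e9,
                              tokens_per_second=12_000,
                              peak_flops=1e15)
print(f"MFU: {mfu:.1%}")  # → MFU: 50.4%
```

An MFU in the 30-50% range is typically considered good for large-scale training; the gap to 100% is exactly the bandwidth and software overhead described above.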
MEVZU N°124 · ISTANBUL · YEAR I — VOL. III
Glossary · Beginner · 1980
FLOPs
Floating-point operations per second — the classic metric for raw compute power.
- EN — FLOPs
- TR — FLOPs (Saniyedeki Kayar Nokta İşlemi)