> To avoid ambiguity I usually use `flop/s` but not everyone likes that :)
Flop/s makes no sense and is outright wrong. The whole point of flops is to express how many floating point operations are required by an algorithm. How many operations are performed per second is a property of the hardware you're using to run an implementation of the algorithm.
You want to express computational complexity in terms of floating point operations.
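As a minimal illustration (the matrix sizes below are made up), the FLOP count of a matrix multiply follows from the problem dimensions alone, before any hardware enters the picture:

```python
# Minimal sketch: the FLOP count of a GEMM C = A @ B depends only on the
# matrix dimensions -- roughly one multiply and one add per (i, j, p)
# triple -- regardless of which implementation or hardware runs it.
def gemm_flops(m: int, n: int, k: int) -> int:
    return 2 * m * n * k

print(gemm_flops(4096, 4096, 4096))  # ~1.4e11 floating point operations
```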
Well, yeah, I use it in the sense of floating-point operations per second. That's the most common usage in my field, and I was replying to the parent comment. No need for that tone.
The number of operations per second varies by at least an order of magnitude on the same hardware depending on the GEMM algorithm you use (reference or Goto), and you quote an implementation's performance in FLOP/s precisely because you know how many FLOPs the computation requires. That makes sense to people who implement and measure these things.
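To make the convention concrete, here's a rough sketch of how such a number gets quoted (the NumPy GEMM and the problem size are my own illustration, not any particular benchmark): the FLOP count is fixed by the problem, while the achieved FLOP/s comes from timing one particular implementation on one particular machine.

```python
# Time one GEMM and report the achieved rate: FLOPs required / seconds taken.
import time
import numpy as np

m, n, k = 2048, 2048, 2048
A = np.random.rand(m, k).astype(np.float32)
B = np.random.rand(k, n).astype(np.float32)

flops = 2 * m * n * k                      # required by the computation itself

start = time.perf_counter()
C = A @ B                                  # one BLAS-backed GEMM call
elapsed = time.perf_counter() - start

print(f"{flops / elapsed / 1e9:.1f} GFLOP/s achieved")
```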