FLOPS, FLoating-point Operations Per Second, is a measure of computer performance. Instructions per second is another measure of computer performance, but FLOPS is particularly useful in scientific fields and distributed computing projects that make heavy use of floating-point calculations, such as the Great Internet Mersenne Prime Search, the Search for Extraterrestrial Intelligence, and Folding@home.
Referring to 1 FLOPS as a singular "flop" is a back-formation, and some confusion can result when differentiating between floating-point operations per second and floating-point operations in general, which are also simply called "flops."
10^24 floating-point operations per second is one yottaFLOPS, abbreviated YFLOPS.
A system's theoretical peak FLOPS can be calculated with the following equation:
FLOPS = sockets × (cores per socket) × clock speed × (FLOPs per cycle), where "FLOPs per cycle" is the number of floating-point operations each core of the specific CPU being used can complete in a single clock cycle.
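The calculation above can be sketched in a few lines of Python. The hardware numbers here are illustrative assumptions for a hypothetical machine, not the specs of any particular system:

```python
# Theoretical peak FLOPS for a hypothetical system.
# All four values below are assumed for illustration.
sockets = 2              # CPU sockets in the machine
cores_per_socket = 8     # physical cores per socket
clock_hz = 3.0e9         # clock speed: 3.0 GHz
flops_per_cycle = 16     # FLOPs per core per cycle (e.g. with wide SIMD units)

peak_flops = sockets * cores_per_socket * clock_hz * flops_per_cycle
print(f"{peak_flops:.2e} FLOPS")  # 7.68e+11 FLOPS, i.e. 768 gigaFLOPS
```

Note that this is a theoretical peak: real workloads rarely sustain it, since memory access and non-floating-point instructions take up cycles too.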
At the time of this writeup, the world's most powerful computers operate in the tens of petaFLOPS (10^15 FLOPS).
Iron Noder 2016, 7/30