Especially in discussions about AI, the abbreviation "FLOPs" gets used for both "floating point operations per second" (a measure of computational power, i.e. a rate) and "floating point operations" (a measure of total computation, equal to that rate multiplied by time in seconds). This is ambiguous and confusing. For clarity, I propose people avoid this particular abbreviation and instead write "FLOP" (for floating point operations) and "FLOP/s" (for floating point operations per second).
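As a minimal illustration of the distinction (the numbers here are made up for the example):

```python
# FLOP/s is a rate; FLOP is the total amount of work done.
rate_flop_per_s = 1e12      # a machine sustaining 10^12 FLOP/s (hypothetical)
duration_s = 100.0          # running for 100 seconds
total_flop = rate_flop_per_s * duration_s  # total work performed: 10^14 FLOP
print(f"{rate_flop_per_s:.0e} FLOP/s for {duration_s:.0f} s = {total_flop:.0e} FLOP")
```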
FWIW, I am ~100% confident that this is correct in terms of what the two terms refer to. Typical estimates are that the brain performs ~10^15 FLOP/s (give or take a few OOM), and the fastest supercomputer in the world reaches ~10^18 FLOP/s at peak (so there's no way GPT-3 was trained at 10^23 FLOP/s).
If we assume the exact numbers here are correct, then the actual conclusion is that GPT-3 was trained on the amount of compute the brain uses in 10^23 FLOP / 10^15 FLOP/s = 10^8 seconds (100 million seconds), or a bit over 3 years.
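A quick sanity check of that arithmetic, plugging in the rough estimates quoted above (10^23 FLOP of training compute, 10^15 FLOP/s for the brain):

```python
# Sanity-check: how long would the brain take to perform GPT-3's training compute?
gpt3_training_flop = 1e23        # rough total training compute (FLOP)
brain_rate_flop_per_s = 1e15     # rough estimate of brain compute (FLOP/s)

equivalent_seconds = gpt3_training_flop / brain_rate_flop_per_s
equivalent_years = equivalent_seconds / (365 * 24 * 3600)

print(f"{equivalent_seconds:.0e} seconds ~= {equivalent_years:.1f} years")
# -> 1e+08 seconds ~= 3.2 years
```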