Still, a number of operations are necessary at the assembly/machine level to perform a flop, and presumably many of the same operations are used when computing a hash. At the very least, you have to move memory around, add values, etc. There should be some level of commensurability in that respect, right?
Unfortunately, there isn't; in most architectures, the integer and bitwise operations that SHA-256 uses and the floating-point operations that FLOPS measure aren't even using the same silicon, except for some common parts that set up the operations but don't limit the rate at which they're done. A typical CPU will do both types of operations, just not with the same transistors, and not with any predictable ratio between the two performance numbers. A GPU will typically be specialized towards one or the other, which is why AMD does so much better than nVidia at mining. An FPGA or ASIC won't do floating point at all.
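To make that concrete, here is a rough sketch of the kind of primitives a SHA-256 round is built from (taken from the FIPS 180-4 definitions): 32-bit rotations, shifts, XORs, and masked additions. Everything is integer and bitwise; nothing here would ever touch a floating-point unit.

```python
# Core bitwise primitives of a SHA-256 round (per FIPS 180-4).
# All arithmetic is on 32-bit unsigned integers -- no floating point anywhere.
MASK32 = 0xFFFFFFFF

def rotr(x, n):
    """Rotate a 32-bit word right by n bits."""
    return ((x >> n) | (x << (32 - n))) & MASK32

def big_sigma0(x):
    # One of the mixing functions used in the compression loop.
    return rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22)

def ch(x, y, z):
    # "Choose": each bit of x selects the corresponding bit of y or z.
    return ((x & y) ^ (~x & z)) & MASK32
```

This is why a mining ASIC can drop the FPU entirely: the entire inner loop is integer ALU work like the above.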
But certainly all of these components can do floating-point arithmetic, even if it requires special programming. People used computers to add decimals before specialized floating-point subsystems existed. And you wouldn't say that an abacus can't handle floating-point arithmetic "because it has no mechanism to split the beads".
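Right, and the abacus point can be shown directly: decimals can be added with nothing but integer arithmetic by scaling everything to a fixed-point representation. This is a hypothetical minimal sketch (names and the two-decimal-place scale are my own choices, not from any particular library), in the spirit of how machines handled decimals before hardware floating point:

```python
# Hypothetical sketch: adding decimal values using only integer arithmetic,
# by representing each value as an integer count of hundredths (fixed point).
SCALE = 100  # two decimal places

def to_fixed(s):
    """Parse a decimal string like '3.25' into an integer number of hundredths."""
    whole, _, frac = s.partition('.')
    frac = (frac + '00')[:2]          # pad/truncate the fraction to two digits
    sign = -1 if whole.startswith('-') else 1
    return int(whole) * SCALE + sign * int(frac)

def to_str(v):
    """Render a fixed-point integer back as a decimal string."""
    sign = '-' if v < 0 else ''
    v = abs(v)
    return f"{sign}{v // SCALE}.{v % SCALE:02d}"

# Addition is just plain integer addition on the scaled values:
# to_str(to_fixed('3.25') + to_fixed('1.50')) gives '4.75'
```

The catch, as the parent comment says, is that none of this tells you anything about relative *rates*: the fact that integer hardware can emulate floating point doesn't give you a fixed hash-to-FLOP exchange ratio.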
There seems to be quite a bit of Bitcoin interest around here, with several articles about it already: [1 2 3 4 5 6 7]
I propose that links and generic Bitcoin comments should be posted here, instead of making a new discussion thread for each interesting article about the subject.