Nanashi comments on A pair of free information security tools I wrote - LessWrong

17 Post author: Nanashi 11 April 2015 11:03PM

Comment author: Nanashi 13 April 2015 03:25:43PM 3 points [-]

I don't really understand the question. Why would someone want to hide their private PGP key in their public PGP signature?

Comment author: ChristianKl 13 April 2015 04:13:28PM *  2 points [-]

You assume that the script can't leak the key if it's sandboxed. For that to be true, it has to be impossible to hide information from the private PGP key in the signature.

I did ask on security.stackexchange and according to the answers it's possible to steal the key.

5) doesn't guarantee security.

My own thinking on security is strongly influenced by CCC hacker thinking: seeing someone on stage giving a lecture on how he tracked Taiwanese money cards during a few weeks there, because the Taiwanese were just too stupid to implement proper security. There are a lot of cases where bad security failed, and that's where the justification for thinking through security implications comes from.

On the other hand you are right that the usability that comes out of that paradigm is lacking.

Comment author: Nanashi 13 April 2015 04:35:11PM 5 points [-]

Now that I understand what you are asking, yes, it is all but impossible to hide a private PGP key in the PGP signature which would successfully verify.

The "answer" described in that Stack Exchange post doesn't work. If you attempted that, the signature would not verify.

Comment author: ChristianKl 13 April 2015 04:42:41PM 2 points [-]

Now that I understand what you are asking, yes, it is all but impossible to hide a private PGP key in the PGP signature which would successfully verify.

How do you know?

Comment author: Nanashi 13 April 2015 06:33:39PM *  4 points [-]

A signed PGP message has three parts, and thus only three places where additional information could be hidden:

1. The header
2. The message itself
3. The signature

The header is standardized. Any changes to the header itself (especially something as blatant as inserting a private key) would be enormously obvious, and would most likely result in a message that would fail to verify due to formatting issues.

The message itself can be verified by the author of the message. If anything shows up on this field that does not exactly match up with what he or she wrote, it will also be extremely obvious.

The signature itself, firstly, must be reproduced with 100% accuracy in order for the message to verify successfully. Any after-the-fact change to either the message or the signature will result in a message that does not verify successfully. (This is, of course, the entire purpose of a digital signature.) Furthermore, the signature is generated algorithmically and cannot be manipulated by user input. The only way to change the signature would be to change the message prior to signing. However, as indicated above, this would be extremely obvious to the author.
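The tamper-detection property claimed above can be sketched with a toy RSA signature. This is a hypothetical illustration with textbook-sized parameters, nothing like real PGP key sizes or the actual OpenPGP signature format:

```python
import hashlib

# Toy RSA sign/verify, illustrating the claim above: a signature is
# generated algorithmically from the message, and any after-the-fact
# change to the message makes verification fail. Parameters are
# hypothetical and far too small for real use.
p = 2147483647                      # 2^31 - 1 (prime)
q = 2305843009213693951             # 2^61 - 1 (prime)
n = p * q                           # modulus
e = 65537                           # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

def sign(message: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)             # only the key holder can compute this

def verify(message: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h      # anyone can check with the public key

sig = sign(b"Secure comment")
assert verify(b"Secure comment", sig)        # untouched message verifies
assert not verify(b"Secure comment.", sig)   # any edit breaks the signature
```

This is why the later replies focus on side channels in the signature's metadata rather than on forging or altering the signed content itself.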

Comment author: Pentashagon 14 April 2015 05:52:34AM 2 points [-]

https://tools.ietf.org/html/rfc4880#section-5.2.3.1 has a list of several subpackets that can be included in a signature. How many people check to make sure the order of preferred algorithms isn't tweaked to leak bits? Not to mention just repeating/fudging subpackets to blatantly leak binary data in subpackets that look "legitimate" to someone who hasn't read and understood the whole RFC.
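The ordering channel suggested here can be made concrete. A hypothetical sketch (illustrative algorithm names, Lehmer-code encoding; not any real implementation's behavior): a signing tool that controls the order of a "preferred algorithms" subpacket can smuggle bits in the permutation it emits.

```python
from math import factorial, log2

# Hypothetical sketch of the subpacket-ordering covert channel: with k
# entries there are k! valid orderings, i.e. about log2(k!) bits per
# signature, and every ordering looks "legitimate".
ALGS = ["AES256", "AES192", "AES128", "CAST5", "3DES"]  # 5! = 120 orderings

def encode(secret: int, items: list) -> list:
    # Map an integer in [0, k!) to a permutation via the Lehmer code.
    items, out = list(items), []
    for i in range(len(items), 0, -1):
        secret, idx = divmod(secret, i)
        out.append(items.pop(idx))
    return out

def decode(perm: list, items: list) -> int:
    # Invert the Lehmer code: recover the integer from the permutation.
    items, digits = list(items), []
    for x in perm:
        idx = items.index(x)
        digits.append(idx)
        items.pop(idx)
    secret, k = 0, len(perm)
    for j in range(k - 1, -1, -1):
        secret = secret * (k - j) + digits[j]
    return secret

perm = encode(42, ALGS)              # looks like an innocent preference list
assert sorted(perm) == sorted(ALGS)  # same algorithms, reordered
assert decode(perm, ALGS) == 42      # yet an attacker recovers the bits
assert int(log2(factorial(5))) == 6  # ~6 bits per signature at k = 5
```

Six bits per signature is slow, but combined with repeated or fudged subpackets the bandwidth grows quickly.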

Comment author: Nanashi 14 April 2015 11:15:27AM 5 points [-]

Remember that I did not invent the PGP protocol. I wrote a tool that uses that protocol. So, I don't know if what you are suggesting is possible or not. But I can make an educated guess.

If what you are suggesting is possible, it would render the entire protocol (which has been around for something like 20 years) broken, invalid and insecure. It would undermine the integrity of vast untold quantities of data. Such a vulnerability would absolutely be newsworthy. And yet I've read no news about it. So of the possible explanations, what is most probable?

  1. Such an obvious and easy to exploit vulnerability has existed for 20ish years, undiscovered/unexposed until one person on LW pointed it out?

  2. The proposed security flaw sounds like maybe it might work, but doesn't.

I'd say #2 is more probable by several orders of magnitude.

Comment author: Pentashagon 15 April 2015 07:01:44AM 5 points [-]

Such an obvious and easy to exploit vulnerability has existed for 20ish years, undiscovered/unexposed until one person on LW pointed it out?

It's not a vulnerability. I trust gnupg not to leak my private key, not the OpenPGP standard. I also trust gnupg not to delete all the files on my hard disk, etc. There's a difference between trusting software to securely implement a standard and trusting the standard itself.

For an even simpler "vulnerability" in OpenPGP look up section 13.1.1 in RFC 4880: EME-PKCS1-v1_5 encoding of a message before encryption. Just replace the pseudo-random padding with bits from the private key. Decoding (section 13.1.2) does not make any requirements on the content of PS.
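The padding channel described here takes only a few lines to sketch. This is a hypothetical illustration of the idea, not gnupg's actual code:

```python
import os

# Hypothetical sketch of the padding channel described above. In
# EME-PKCS1-v1_5 (RFC 4880 section 13.1.1) the encoded block is
#   EM = 0x02 || PS || 0x00 || M
# where PS should be pseudo-random nonzero octets. A malicious encoder
# can substitute private-key bytes for PS; decoding (section 13.1.2)
# checks only the framing, never the content of PS.

def eme_encode(message: bytes, k: int, ps_source: bytes = None) -> bytes:
    ps_len = k - len(message) - 2
    if ps_source is None:
        ps = b""
        while len(ps) < ps_len:                  # honest: random, nonzero
            ps += bytes(b for b in os.urandom(ps_len) if b)
        ps = ps[:ps_len]
    else:
        # Malicious: leak key material (mapping any 0x00 to 0x01 so the
        # framing still parses; a real channel would encode this better).
        ps = bytes((b or 1) for b in ps_source[:ps_len])
    return b"\x02" + ps + b"\x00" + message

def eme_decode(block: bytes) -> bytes:
    assert block[0] == 0x02
    sep = block.index(b"\x00", 1)   # first zero octet ends the padding
    return block[sep + 1:]

secret_key = bytes(range(1, 33))    # stand-in for exported key bytes
block = eme_encode(b"hello", 64, secret_key * 2)
assert eme_decode(block) == b"hello"   # decodes like a normal message...
assert secret_key in block             # ...yet carries the key material
```

Any conforming decoder accepts the malicious block, because the standard only constrains the framing around PS, not its contents.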

Comment author: Nanashi 15 April 2015 11:45:06AM 2 points [-]

Thank you by the way for actually including an example of such an attack. The discussion between ChristianKl and myself covered about 10 different subjects so I wasn't exactly sure what type of attack you were describing.

You are correct, in such an attack it would not be a question of trusting OpenPGP. It's a general question of trusting software. These vulnerabilities are common to any software that someone might choose to download.

In this case, I would argue that a transparent, sandboxed programming language like javascript is probably one of the safer pieces of "software" someone can download. Especially because browsers basically treat all javascript like it could be malicious.

Comment author: Pentashagon 18 April 2015 03:47:16AM 2 points [-]

In this case, I would argue that a transparent, sandboxed programming language like javascript is probably one of the safer pieces of "software" someone can download. Especially because browsers basically treat all javascript like it could be malicious.

Why would I paste a secret key into software that my browser explicitly treats as potentially malicious? I still argue that trusting a verifiable author/distributor is safer than trusting an arbitrary website, e.g. trusting gpg is safer than trusting xxx.yyy.com/zzz.js regardless of who you think wrote zzz.js, simply because it's easier to get that wrong in some way than it is to accidentally install an evil version of gpg, especially if you use an open source package manager that makes use of PKI, or run it from TAILS, etc. I am also likely to trust javascript crypto served from https://www.gnupg.org/ more than from any other URL, for instance.

In general I agree wholeheartedly with your comment about sandboxing being important. The problem is that sandboxing does not imply trusting. I think smartphone apps are probably better sandboxed, but I don't necessarily trust the distribution infrastructure (app stores) not to push down evil updates, etc. Sideloading a trusted app by a trusted author is probably a more realistic goal for OpenPGP for the masses.

Comment author: ChristianKl 15 April 2015 02:23:33PM -2 points [-]

The core question isn't "how safe is X" but "what safety guarantees does X make" and "does X actually hold its promises".

A decently used software downloaded from sourceforge is more trustworthy than unknown code transferred unencrypted over the internet.

Projects like Tor go even beyond that standard and provide deterministic builds to allow independent verification of checksums, to make sure that you really are running the code you think you are running.

It's a general question of trusting software.

In this case, trusting software that travels unencrypted through the internet. It's quite an easy principle not to trust code that travels unencrypted to do anything. It's really security 101: don't trust unencrypted communication channels.

Yes, there might be times when you violate that heuristic and don't get harmed, but good security practice is still "Don't trust unencrypted communication channels."

The idea of saying "Well, I don't have to trust the unencrypted communication channels because I can do my fancy sandboxing" shouldn't come up. It's not how you think in crypto. In this case, the sandboxing doesn't work.

You could have said: "This is just a fun project, don't put any important private keys into it." You didn't, but instead argued that your system can do more than it can.

The fact that you made those promises so laxly also makes it doubtful that the iPhone app provides what it claims. Key issues:

1) Do you make sure that the real image never gets written into SSD storage? (There's no way to reliably delete files from SSD storage.)
2) Did you get the entropy generation really right?
3) Do you really leave no traces in the final image?
4) No other bugs that make the crypto fail?

Given the security-101 issues with the other project and the way you present it, why should someone trust that you handled those questions well?

Comment author: Pentashagon 15 April 2015 05:42:36AM *  2 points [-]

NOTE: lesswrong eats blank quoted lines. Insert a blank line after "Hash: SHA1" and "Version: GnuPG v1".

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Secure comment
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1

iQgEBAEBAgduBQJVLfkkxqYUgAAAAAAKB1NzZWNyZXRAa2V5LS0tLS1CRUdJTiBQ
R1AgUFJJVkFURSBLRVkgQkxPQ0stLS0tLXxWZXJzaW9uOiBHbnVQRyB2MXx8bFFI
WUJGVXQ5WU1CQkFESnBtaGhjZXVqSHZCRnFzb0ErRnNTbUtCb3NINHFsaU9ibmFH
dkhVY0ljbTg3L1IxZ3xYNFJURzFKMnV4V0hTeFFCUEZwa2NJVmtNUFV0dWRaQU56
RVFCQXNPR3VUQW1WelBhV3ZUcURNMGRKbHEzTmdNfG1Edkl2a1BJeHBoZm1KTW1L
YmhQcTBhd3ArckFSU3BST01pMXMvWUtLRWEweUdYaFN6MG1uZkYrZ3dBUkFRQUJ8
QUFQL1MrRjBWdkxzOW5HZWZjRFNpZ0hyRjNqYXAvcCtSNTArNGdDenhuY3djelBJ
dXR5NU1McFF5NHMxQVZ2T3xNcDZrZFFDV2pVUXdWZTc4WEF3WjNRbEh5dkVONDdx
RDZjNVdOMGJuTGpPTEVIRE9RSTNPQi9FMUFrNzlVeXVRfFQ0b21IVWp5MlliVWZj
VnRwZWJOR3d4RkxpV214RW1QZG42ZGNLVFJzenAzRDdFQ0FOSWxYbWVTbXR4WFRO
REp8REFrOUdoa1N6YnoyeFladmxIekZHb0ltRmU4NGI5UGZ5MEV1dHdYUFFmUUl0
VTBGTExxbkdCeStEWnk2MmpUc3xTMDlWbkpFQ0FQV21kZXhkb1ZKSjJCbUFINHE2
RlRCNXhrUnJMTzNIRk1iZU5NT2Z2T2ducy9Fa2czWjJQcnpHfG43RGdVU1FIZTFp
UHJJODJ0VmJYYXR6RE1xMnZ3OU1DQUt5SmtONnVzUFZkVHF5aVFjMDN6anJWMUNu
YkNrK1h8WVNtekpxcFd0QzVReWlycUp3ODlWQ2dBaDh4YlorWnI0NlY2R3VhdkdD
azdPbGIzQnE1a2V4ZWU2YlFMY0dWdXxkR0Z6YUdGbmIyNkl1QVFUQVFJQUlnVUNW
UzMxZ3dJYkF3WUxDUWdIQXdJR0ZRZ0NDUW9MQkJZQ0F3RUNIZ0VDfEY0QUFDZ2tR
YTJuMThBNWR2N0owVmdQNkFzdjBrS1ZWb050ZE5WSklHWThLN2I2L1l0ZWlJNFpa
NUJtL2YzUFF8WUpCRVVGWTljV1E2TVlZWFFlYm9TWHN1amN2cWJJMkpERFZ5dDFR
SCtXdk00dFhiNmdmaGp1a2hobmxaTUNnSnx0eXp1aHdZWHloZGVaMFZmb0hOeUxP
WHQyL1VvWCtsdVd4aWhkN1Exd2IrNjljVDV1V1IrYVEwK3h6SXJpVUdlfFBReWRB
ZGdFVlMzMWd3RUVBTXU4bWc1cmZMNERnNE5TaHNDc2YyQkd2UnJhZGRDcmtxTk40
ckNwNkdCUXBGQ018MVJldGIwYURQSkhsbWpnaWdOUzBpQTgvWXdyUGx0VktieW9r
S2NXZklmYTlmNjE1SmhwNHM3eEFXSUlycGNwaHxPdjlGakRsUldYd09PbXFBYzB5
dVV4WjN2Z2JERUZPWGRuQWk2ZDJDV0Y5a1B5UTlQbG5zL3gxcGtLS0xBQkVCfEFB
RUFBL29DMmsrTWwzbGdybXMvVnlsOGl5M01GYWJTT0hBMmpYWE9oRDhDQlptenQ0
MWF5ZzRMSXlvNnQ0aGl8bHBvZWpScDJ0VmNaRE9TQWVKV3BHT2k0Nkt3T1g1VXdW
bUI4ZldTbTJobHZxbWJ0ckNWUGUzZGQzZGVCMlM2RXxsTW5qa0YxWWtDYVl5ZGZo
Mi9BQ2lpT1RrNGZPREdzdVh1eU9jKytQSUwxVllxMVJjUUlBemk2bzZFMVhYTnpV
fEJmMUs3clZ2N3luMVJBRmZ1aWkrOFA1OGNtWnVheld0WVA0bTlVNTdLNjhHN0lH
QTRINUNYa1pTS1A0bDdTWHR8ZWQ2b01vZmlVd0lBL1Bhc2hqUnJXSUVBSDk4bEJR
aXdISmZWUlBsR1R6YU92Q0I3TXYyamZIdnlCR0lvTkF0aXx1ZXByT0VTMHZUNysy
eklaU201ei9rTG03UytzV3RNbjZRSUFrend6bTdRRFhLbjNiSm9BUEgvL2dOdWlY
NHRkfFNlSHJSNTJUTmhmTzJqTEZKU040K1pjMktnTkNDYVlzQ0haUEkrc214YWQ1
YU1BeG5qN3JXRlNSWTV2RmlKOEV8R0FFQ0FBa0ZBbFV0OVlNQ0d3d0FDZ2tRYTJu
MThBNWR2N0wrZ2dQL1hVN3IzR1I2bVRsanA5SVBHQXJ2aEVhNHxRZlBSbWIzWEly
ekJBVVR0Ti9KZXA1cFVUcno0N1pQcHdkckJnZnFvOXUweDgwUCtKdlY4azR0MGpX
c09nUlFyfDQrazhMRTFMSVBFbTl2Q2h0aVd4V2Z6eGNUSUF6ZXdhN20vZ2VscU1S
aGJibVNLeGdZNkhUV2pVYml6Q3ZsQit8Z0Q5UGRMNjU4RThUQkZxSlliUT18PU1l
VEl8LS0tLS1FTkQgUEdQIFBSSVZBVEUgS0VZIEJMT0NLLS0tLS18AAoJEGtp9fAO
Xb+yVvEEAJkkFIaHkFJ6OTKrKge/7+C9Vn+IkoMIq1bqsyyhClMVnuqiAG6fhqzv
glYeeVxtqYac9ecSzIbuszaHckcdA/q2onyeXjW1nfaBy2EdsxGuwCxvr+ac17v2
HhSKU/BXYMNRm7vLjDnRq99ON5+1F6IwY7rmuMJZuVOYEqPBdaTs
=FDzi
-----END PGP SIGNATURE-----

Output of gpg --verify:

gpg: Signature made Tue 14 Apr 2015 10:37:40 PM PDT using RSA key ID 0E5DBFB2
gpg: Good signature from "pentashagon"
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: B501 B12E 5184 8694 4557 01FC 6B69 F5F0 0E5D BFB2

Output of gpg -vv --verify:

gpg: armor: BEGIN PGP SIGNED MESSAGE
gpg: armor header: Hash: SHA1
:packet 63: length 19 - gpg control packet
gpg: armor: BEGIN PGP SIGNATURE
gpg: armor header: Version: GnuPG v1
:literal data packet:
mode t (74), created 0, name="",
raw data: unknown length
gpg: original file name=''
:signature packet: algo 1, keyid 6B69F5F00E5DBFB2
version 4, created 1429076260, md5len 0, sigclass 0x01
digest algo 2, begin of digest 56 f1
hashed subpkt 2 len 4 (sig created 2015-04-15)
hashed subpkt 20 len 1893 (notation: secret@key=-----BEGIN PGP PRIVATE KEY BLOCK-----|Version: GnuPG v1||lQHYBFUt9YMBBADJpmhhceujHvBFqsoA+FsSmKBosH4qliObnaGvHUcIcm87/R1g|X4RTG1J2uxWHSxQBPFpkcIVkMPUtudZANzEQBAsOGuTAmVzPaWvTqDM0dJlq3NgM|mDvIvkPIxphfmJMmKbhPq0awp+rARSpROMi1s/YKKEa0yGXhSz0mnfF+gwARAQAB|AAP/S+F0VvLs9nGefcDSigHrF3jap/p+R50+4gCzxncwczPIuty5MLpQy4s1AVvO|Mp6kdQCWjUQwVe78XAwZ3QlHyvEN47qD6c5WN0bnLjOLEHDOQI3OB/E1Ak79UyuQ|T4omHUjy2YbUfcVtpebNGwxFLiWmxEmPdn6dcKTRszp3D7ECANIlXmeSmtxXTNDJ|DAk9GhkSzbz2xYZvlHzFGoImFe84b9Pfy0EutwXPQfQItU0FLLqnGBy+DZy62jTs|S09VnJECAPWmdexdoVJJ2BmAH4q6FTB5xkRrLO3HFMbeNMOfvOgns/Ekg3Z2PrzG|n7DgUSQHe1iPrI82tVbXatzDMq2vw9MCAKyJkN6usPVdTqyiQc03zjrV1CnbCk+X|YSmzJqpWtC5QyirqJw89VCgAh8xbZ+Zr46V6GuavGCk7Olb3Bq5kexee6bQLcGVu|dGFzaGFnb26IuAQTAQIAIgUCVS31gwIbAwYLCQgHAwIGFQgCCQoLBBYCAwECHgEC|F4AACgkQa2n18A5dv7J0VgP6Asv0kKVVoNtdNVJIGY8K7b6/YteiI4ZZ5Bm/f3PQ|YJBEUFY9cWQ6MYYXQeboSXsujcvqbI2JDDVyt1QH+WvM4tXb6gfhjukhhnlZMCgJ|tyzuhwYXyhdeZ0VfoHNyLOXt2/UoX+luWxihd7Q1wb+69cT5uWR+aQ0+xzIriUGe|PQydAdgEVS31gwEEAMu8mg5rfL4Dg4NShsCsf2BGvRraddCrkqNN4rCp6GBQpFCM|1Retb0aDPJHlmjgigNS0iA8/YwrPltVKbyokKcWfIfa9f615Jhp4s7xAWIIrpcph|Ov9FjDlRWXwOOmqAc0yuUxZ3vgbDEFOXdnAi6d2CWF9kPyQ9Plns/x1pkKKLABEB|AAEAA/oC2k+Ml3lgrms/Vyl8iy3MFabSOHA2jXXOhD8CBZmzt41ayg4LIyo6t4hi|lpoejRp2tVcZDOSAeJWpGOi46KwOX5UwVmB8fWSm2hlvqmbtrCVPe3dd3deB2S6E|lMnjkF1YkCaYydfh2/ACiiOTk4fODGsuXuyOc++PIL1VYq1RcQIAzi6o6E1XXNzU|Bf1K7rVv7yn1RAFfuii+8P58cmZuazWtYP4m9U57K68G7IGA4H5CXkZSKP4l7SXt|ed6oMofiUwIA/PashjRrWIEAH98lBQiwHJfVRPlGTzaOvCB7Mv2jfHvyBGIoNAti|ueprOES0vT7+2zIZSm5z/kLm7S+sWtMn6QIAkzwzm7QDXKn3bJoAPH//gNuiX4td|SeHrR52TNhfO2jLFJSN4+Zc2KgNCCaYsCHZPI+smxad5aMAxnj7rWFSRY5vFiJ8E|GAECAAkFAlUt9YMCGwwACgkQa2n18A5dv7L+ggP/XU7r3GR6mTljp9IPGArvhEa4|QfPRmb3XIrzBAUTtN/Jep5pUTrz47ZPpwdrBgfqo9u0x80P+JvV 8k4t0jWsOgRQr|4+k8LE1LIPEm9vChtiWxWfzxcTIAzewa7m/gelqMRhbbmSKxgY6HTWjUbizC vlB+|gD9PdL658E8TBFqJYbQ=|=MeTI|-----END PGP PRIVATE KEY BLOCK-----|)
subpkt 16 len 8 (issuer key ID 6B69F5F00E5DBFB2)
data: [1024 bits]
gpg: Signature made Tue 14 Apr 2015 10:37:40 PM PDT using RSA key ID 0E5DBFB2
gpg: using PGP trust model
gpg: Good signature from "pentashagon"
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: B501 B12E 5184 8694 4557 01FC 6B69 F5F0 0E5D BFB2
gpg: textmode signature, digest algorithm SHA1

I ran the exported (unencrypted) private key through tr '\n' '|' to get a single line of text to use as the notation value, and created the signature with:

gpg -a --clearsign --sig-notation secret@key="exported-secret-key-here" -u pentashagon

Let me know if your OpenPGP software of choice makes it any more clear that the signature is leaking the private key without some sort of verbose display.

Comment author: itaibn0 14 April 2015 02:57:26PM 2 points [-]

I've never seen it stated as a requirement of the PGP protocol that it is impossible to hide extra information in a signature. In an ordinary use case this is not a security risk; it's only a problem when the implementation is untrusted. I have as much disrespect as anyone towards people who think they can easily achieve what experts who spent years thinking about it can't, but that's not what is going on here.

Comment author: Nanashi 14 April 2015 04:55:03PM 3 points [-]

Let's assume you CAN leak arbitrary amounts of information into a PGP signature.

  1. Short of somehow convincing the victim to send you a copy of their message, you have no means of accessing your recently-leaked data. And since that is extremely unlikely, your only hope is to view a public message the user posts with their compromised signature. Which leads to....

  2. That leaked data would be publicly available. Anyone with knowledge of your scheme would also be able to access that data. Any encryption would be worthless because the encryption would take place client-side and all credentials thus would be exposed to the public as well. Which brings us to....

  3. Because the script runs client-side, it also makes it extremely easy for a potential victim to examine your code to determine if it's malicious or not. And, even if they're too lazy to do so...

  4. A private key is long. A PGP signature is short. So your victim's compromised signature would be 10x longer than the length of a normal PGP signature.

So yes, you all are correct. If I had malicious intent, I could write an attack that 1. could be immediately exposed to the public by any person with programming knowledge, 2. provides an extremely obvious telltale sign to the victim that something malicious is going on, and 3. doesn't actually provide me any benefit.

Comment author: ChristianKl 14 April 2015 05:21:00PM -2 points [-]

That leaked data would be publicly available. Anyone with knowledge of your scheme would also be able to access that data.

That's often the case with backdoors.

Any encryption would be worthless because the encryption would take place client-side and all credentials thus would be exposed to the public as well.

Did you understand the point of private-public key crypto?

Because the script runs client-side, it also makes it extremely easy for a potential victim to examine your code to determine if it's malicious or not. And, even if they're too lazy to do so...

I doubt anyone would bother to examine the code to a sufficient level to find security flaws. Especially since the code seems a bit obfuscated.

How long did it take people to find out that Debian's crypto was flawed? RSA?

A private key is long. A PGP signature is short. So your victim's compromised signature would be 10x longer than the length of a normal PGP signature.

That just means that it takes 10 signed messages to leak all the data. Maybe a bit more because you have to randomly pick one of 10 slots; maybe a bit less because you can do fancy math.

Comment author: Pentashagon 15 April 2015 06:24:19AM 0 points [-]
  1. Short of somehow convincing the victim to send you a copy of their message, you have no means of accessing your recently-leaked data.

Public-key signatures should always be considered public when anticipating attacks. Use HMACs if you want secret authentication.

  2. That leaked data would be publicly available. Anyone with knowledge of your scheme would also be able to access that data. Any encryption would be worthless because the encryption would take place client-side and all credentials thus would be exposed to the public as well.

You explicitly mentioned Decoy in your article, and a similar method could be used to leak bits to an attacker with no one else being able to recover them. We're discussing public key encryption in this article which means that completely public javascript can indeed securely encrypt data using a public key and only the owner of the corresponding private key can decrypt it.

  3. Because the script runs client-side, it also makes it extremely easy for a potential victim to examine your code to determine if it's malicious or not. And, even if they're too lazy to do so...

Sure, the first five or ten times it's served. And then one time the victim reloads the page, the compromised script runs, leaks as much or all of the private key as possible, and then never gets served again.

  4. A private key is long. A PGP signature is short. So your victim's compromised signature would be 10x longer than the length of a normal PGP signature.

An exported private key is long because it includes both factors, the private exponent, and the inverse of p mod q. In my other comment I was too lazy to decode the key and extract one of the RSA factors, but one factor will be ~50% of the size of the RSA signature and that's all an attacker needs.
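The arithmetic behind that last point can be sketched directly. Toy primes below are hypothetical stand-ins for real 1024-bit factors; the point is that leaking a single factor hands over the entire key:

```python
# Sketch of why one leaked factor suffices: given the public key (n, e)
# and one prime p, the whole private key is recomputable. The primes
# here are hypothetical stand-ins for real 1024-bit RSA factors.
p = 2147483647                      # leaked factor (2^31 - 1)
q = 2305843009213693951             # the other secret factor (2^61 - 1)
n = p * q                           # public modulus
e = 65537                           # public exponent

# Attacker's side: only n, e and the leaked p are needed.
q_recovered = n // p
phi = (p - 1) * (q_recovered - 1)
d = pow(e, -1, phi)                 # private exponent
u = pow(p, -1, q_recovered)         # p^-1 mod q, as OpenPGP stores it

assert q_recovered == q
assert pow(pow(1234567, e, n), d, n) == 1234567   # working private key
```

So the covert channel only needs to move one factor, roughly half the bits of the modulus, not the whole exported key block.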

Comment author: ChristianKl 13 April 2015 06:55:28PM *  -2 points [-]

Furthermore, the signature is generated algorithmically and cannot be manipulated by user input.

"Algorithmically" doesn't mean that there exactly one way to create a valid signature. Hash functions quite often have collisions.

Comment author: Nanashi 13 April 2015 08:08:53PM *  6 points [-]

I'm downvoting this comment because it's misleading.

First of all, no one has ever found an SHA-2 hash collision yet. Second of all, the chances of two SHA-2 hashes colliding is about 1 in 1 quattuorvigintillion. It's so big I had to look up what the number name was. It's 1 with 77 zeroes after it. We're talking universe-goes-into-heat-death-before-it-happens type odds. Only under the most absurd definition of "quite often" could anyone ever reasonably claim that a cryptographic hash function like SHA-2 "quite often" has collisions.
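The order of magnitude quoted above checks out; a quick sketch:

```python
# Quick check of the figure above: the chance that two specific inputs
# share a SHA-256 digest is 1 in 2^256.
n = 2 ** 256
assert len(str(n)) == 78            # 78 digits, i.e. on the order of 10^77
assert 10 ** 77 < n < 2 * 10 ** 77  # about 1.16 * 10^77
```

(Finding *some* colliding pair by brute force is easier, roughly 2^128 attempts by the birthday bound, which is still far beyond reach.)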

Comment author: dxu 13 April 2015 08:13:10PM 0 points [-]

It's 1 with 77 zeroes after it.

Not that I disagree with your general point, but... 77 isn't a multiple of 3.

Comment author: Nanashi 13 April 2015 08:18:02PM 1 point [-]

Why does it need to be a multiple of 3?

(SHA-2 = 2^256 = 1*10^77)

Comment author: dxu 13 April 2015 08:22:49PM 3 points [-]

You wrote that the odds were 1 in 1 quattuorvigintillion. I was under the impression that all "-illion"s have exponents that are multiples of 3.