All of CillianSvendsen's Comments + Replies

Interesting post! A relevant post might be Eliezer's Harder Choices Matter Less.

After learning more about the math behind quantum mechanics, I'm pretty sure indeterminacy doesn't work that way. :P

From the Azkaban chapters:

From what Amelia heard, Dumbledore had gotten smarter toward the end of the war, mostly due to Mad-Eye's nonstop nagging; but had relapsed into his foolish mercies the instant Voldemort's body was found.

Dumbledore's lesson from his room isn't that you need to shut up and multiply; it's that war is so terrible that you must be willing to sacrifice anything to prevent it from occurring again. He sacrificed people's lives to stop a war, but he isn't willing to sacrifice anyone except to prevent more violence. Dumbledore never...

Very interesting. When I was 10, a friend and I got together to "crack" the problem of indeterminacy. We also came up with this hypothesis (I fail to recall how).

(On a tangentially related note: After reading a couple of Wikipedia articles, we decided we were wrong and moved on to the hypothesis that the universe was a giant simulation, and quantum indeterminacy was floating-point error.)

0TheAncientGeek
What problem? If you want to predict, indeterminacy is inconvenient, but why should the universe be convenient for humans?
0KnaveOfAllTrades
What're your current thoughts on this? Also, you and your friend sound awesome.

I found this very interesting because when I was 12, I read a book very similar to the Childcraft book you mention, and also vowed never to do drugs, drink, give in to peer pressure, act angry and emotional, etc. Except later on, when I became a teenager, my guardians took this behavior as evidence of my "abnormality" and tried very hard to quash it, even going so far as to push me to drink and "fit in". Unfortunately they've been partially successful - at the very least, I felt very resentful toward them for a very long time.

Much like the NSA is considered ahead of the public because the cypher-tech of theirs that leaks out is years ahead of publicly available tech, SI/MIRI is ahead of us because the things that leak from them show that they figured out long ago what we've only just figured out.

2Bugmaster
Wait, is NSA's cypher-tech actually legitimately ahead of anyone else's? From what I've seen, they couldn't make their own tech stronger, so they had to sabotage everyone else's -- by pressuring IEEE to adopt weaker standards, installing backdoors into Linksys routers and various operating systems, exploiting known system vulnerabilities, etc. Ok, so technically speaking, they are ahead of everyone else; but there's a difference between inventing a better mousetrap, and setting everyone else's mousetraps on fire. I sure hope that's not what the people at SI/MIRI are doing. You linked to DES and SHA, but AFAIK these things were not invented by the NSA at all, but rather adopted by them (after they made sure that the public implementations are sufficiently corrupted, of course). In fact, I would be somewhat surprised if the NSA actually came up with nearly as many novel, ground-breaking crypto ideas as the public sector. It's difficult to come up with many useful new ideas when you are a secretive cabal of paranoid spooks who are not allowed to talk to anybody. Edited to add: So, what things have been "leaked" out of SI/MIRI, anyway?

I don't think that's a foregone conclusion. After all, there seem to be many proposals on how to get around the problem that individuals compete with each other. For example, there's Eliezer's idea of using humanity's coherent extrapolated volition to guide the AI. I also don't think it's to anyone's advantage to have a hostile AI; no one will try to bring about an explicitly hostile AI on purpose, and anyone sufficiently intelligent to program a working AI will probably recognize the dangers that AI poses.

Yes, humans will fight amongst themselves ...