
"Yudkowsky founded the Machine Intelligence Research Institute, with the money of venture capitalist Peter Thiel, which focusses on the development of a benevolent AI. It should be benevolent, because it will have power over people, when it gets smarter than them. For the neoreactionaries, Intelligence is necessarily related to politics. based on the concept of human biodiversity, they believe that social and economical differences are caused and justified by a genetic difference in intelligence among ethnic groups. They reject the idea of a common human nature."

Oh, come on, that's a poorly-thought-out attack. "Yudkowsky thinks that AI will be super-powerful. Neo-reactionists think that powerful people are powerful and smart for genetic reasons. Therefore, Yudkowsky has something to do with neo-reactionism." Really?

A person is nature plus nurture, and besides, I'm not even sure if DNA alone would produce the same baby. Epigenetics, womb variation, and whatnot all have an effect even before a child is born.

I know! Is the world not more beautiful when one can understand how it works?

"The probability that the universe only has finite space is not exactly 1, is it?"

Nooooo, that's not it. The probability that the space reachable from a particular point within a given time is finite is effectively one.

So it doesn't matter how large the universe is - the aliens a few trillion ly away cannot have killed Bob.
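For concreteness, a back-of-the-envelope bound (taking the universe to be roughly 13.8 billion years old and, as a simplifying assumption, ignoring cosmological expansion): nothing can have causally influenced Bob from farther away than

$$d_{\max} = c\,t \approx (1\ \text{light-year/year}) \times 1.38 \times 10^{10}\ \text{years} \approx 1.4 \times 10^{10}\ \text{light-years}.$$

Accounting for expansion raises the causal horizon to roughly $4.6 \times 10^{10}$ light-years, still about two orders of magnitude short of "a few trillion."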

Hmm? Ah, I see; you think that I am annoyed. No, I only quoted Lumifer because their words nearly sufficed. Rest assured that I do not blame you for lacking the ability to gather information from the future.

(I recognize that you meant instrumental rationality rather than epistemic rationality, and have read the comment with that in mind.)

Epistemic rationality is not equivalent to "being a Spockish asshole." It simply means that one values rationality as an end and not just a means. If you do not value correcting people's grammar for its own sake, then there is no reason to correct someone's grammar. But that is an instrumental statement, so I suppose I should step back...

If you think that epistemic and instrumental rationality would disagree at certain points, try to reconsider their relationship. Any statement of "this ought to be done" is instrumental; epistemic rationality only covers "this is true/false."

Sounds meaninglessly deep to me.

"See ETA to the comment." Lumifer meant instrumental rationality.

Well, the problem with the Doomsday Argument, as I see it, is not the probability distribution but the assumption that we are "typical humans" with a typical perspective. If, for example, you think that the most likely cause of the end of humanity would be predictable and known for millennia in advance, then the assumption does not hold, since we currently see no for-sure end of humanity in our future.
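For reference, a minimal sketch of the standard (Gott-style) formulation that the typicality assumption feeds into: if your birth rank $r$ among all $N$ humans who will ever live is assumed uniform on $\{1, \dots, N\}$, then

$$\Pr\!\left(r > \tfrac{N}{20}\right) = 0.95 \quad\Longrightarrow\quad \Pr(N < 20\,r) = 0.95.$$

Drop the uniformity ("typical human") assumption and the 95% bound on humanity's total population goes with it.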

Not entropy, but rather causation; time does not flow backwards because what I do tomorrow will not affect what I did yesterday.
