All of jwoodward48's Comments + Replies

"Yudkowsky founded the Machine Intelligence Research Institute, with the money of venture capitalist Peter Thiel, which focusses on the development of a benevolent AI. It should be benevolent, because it will have power over people, when it gets smarter than them. For the neoreactionaries, Intelligence is necessarily related to politics. based on the concept of human biodiversity, they believe that social and economical differences are caused and justified by a genetic difference in intelligence among ethnic groups. They reject the idea of a common hu... (read more)

A person is nature plus nurture, and besides, I'm not even sure if DNA alone would produce the same baby. Epigenetics, womb variation, and whatnot all have an effect even before a child is born.

I know! Is the world not more beautiful when one can understand how it works?

"The probability that the universe only has finite space is not exactly 1, is it?"

Nooooo, that's not it. The probability that the space reachable from a particular point within a given time is finite is effectively one.

So it doesn't matter how large the universe is - the aliens a few trillion ly away cannot have killed Bob.
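The arithmetic behind that claim can be sketched in a few lines. This is a minimal illustration, assuming a universe age of roughly 13.8 billion years and ignoring cosmic expansion (which only makes distant events *harder* to reach); the function name and numbers are mine, not from the comment:

```python
# Sketch of the past-light-cone argument: nothing travels faster than light,
# so an event can only have affected us if its distance (in light-years) is
# at most the elapsed time (in years).

UNIVERSE_AGE_YEARS = 13.8e9  # approximate age of the universe (assumption)


def causally_reachable(distance_ly: float, elapsed_years: float) -> bool:
    """Could an event at distance_ly light-years have influenced us
    within elapsed_years years? Light covers 1 ly per year, so influence
    can have crossed at most elapsed_years light-years."""
    return distance_ly <= elapsed_years


# Aliens a few trillion light-years away are far outside our past light cone:
print(causally_reachable(3e12, UNIVERSE_AGE_YEARS))  # False
# whereas a star a billion light-years away is inside it:
print(causally_reachable(1e9, UNIVERSE_AGE_YEARS))  # True
```

So however large the universe is, only the finite region within our past light cone matters for what could have happened to Bob.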

Hmm? Ah, I see; you think that I am annoyed. No, I only quoted Lumifer because their words nearly sufficed. Rest assured that I do not blame you for lacking the ability to gather information from the future.

(I recognize that you meant instrumental rationality rather than epistemic rationality, and have read the comment with that in mind.)

Epistemic rationality is not equivalent to "being a Spockish asshole." It simply means that one values rationality as an end and not just a means. If you do not value correcting people's grammar for its own sake, then there is no reason to correct someone's grammar. But that is an instrumental statement, so I suppose I should step back...

If you think that epistemic and instrumental rationality would disagree at cert...

Lumifer
Yes, of course. Notably, epistemic rationality only requires you to look for and to prefer truth. It does not require you to shove the truth you found into everyone else's face. One can find edge cases, but generally speaking, if you treat epistemic rationality narrowly (see above), I would expect such a disagreement to arise very rarely. On the other hand there are, as usual, complications :-/ For example, you might not go find the truth because doing so requires resources (e.g. time) and you feel those resources would be better spent elsewhere. Or, if you think you have difficulty controlling your mind (see the rider-and-elephant metaphor), you might find useful some tricks which involve deliberately denying some information to yourself.

Sounds meaninglessly deep to me.

Jiro
It isn't. It's meant to point out that calling something a "tax on stupidity" is itself meaninglessly deep-sounding. Intelligence is used for pretty much everything; calling something a tax on stupidity says nothing more about it than "it's part of the world."

"See ETA to the comment." Lumifer meant instrumental rationality.

Elo
Comment was before his ETA. Ta.

Well, the problem with the Doomsday Argument, as I see it, is not the probability distribution but the assumption that we are "typical humans" with a typical perspective. If you think that the most likely cause for the end of humanity would be predictable and known for millennia, for example, then the assumption does not hold, as we currently do not see a for-sure end of humanity in our future.

Not entropy, but rather causation; time does not flow backwards because what I do tomorrow will not affect what I did yesterday.

"Why do people have a tendency to believe that their minds are somehow separate from the rest of the universe?"

Because the concept of self as distinct from one's surroundings is part of subjective experience. Heck, I'd consider it to be one of the defining qualities of a person/mind.

What is a "reason"? Nothing but a cause (that is meaningfully, reasonably, and predictably tied to the effect, perhaps). The only cases in which a mind has a spontaneous thought (that is, one with no reason for them), are "brain static" and Boltzmann brains. So your question is essentially reducible to the question of "Why am I not a Boltzmann brain?"

Edit: I'm not really sure that "reason" is equivalent to "cause", on further reflection. There needs to be a deeper connection between A and B, if A is said to...

Ah, but that's assuming that their goal is "get into a band," rather than "attain a new and interesting hobby."