“User does not meet the requirements to vote”
What’s going on here? I have a relatively new account and I used to be able to vote, but now I just get this message. I also can’t change votes I’ve already cast.
Hi, I’m a new user who stumbled across this, so I figured it would be worth commenting. I came here via effective altruism and have since read a decent chunk of the Sequences, so LW wasn’t totally new to me by the time I read this, but still.
I definitely wish this introduction had been here when I first decided to take a look at LessWrong - it was a little confusing to figure out what the community was even supposed to be. The introductory paragraph is excellent for communicating the core that the community is built around, and the following sections seem super efficient at getting us up to speed on...
Well luckily this question gave me enough karma to upvote your answer :) thanks!
Weird, because the comments I made didn’t receive any votes, yet it seems like I stopped being able to vote after writing them. Unless this requirement was added in the last few weeks, which would explain the change.
Ethics is (infuriatingly) unique in this respect.
Discussion of beliefs that do not make observable predictions is unproductive (Making Beliefs Pay Rent), and discussion of beliefs that do not make ANY predictions about ANYTHING EVER is literally meaningless (the different versions of reality are not meaningfully distinguishable).
That said… ethics is an exception to this rule, because although ethical beliefs don’t make predictions (about anything, ever), they still have implications for how you should behave. This is entirely unique to ethical beliefs.
As much as I’d love to do away with the infinite rambling debates over predictionless beliefs, ethics stands in the way. Ethical beliefs pay rent not in the currency of predictions to be used in achieving your goals, but in the form of the goals themselves - an offer so irresistible to instrumental rationalists like myself that we will trample far past our ordinary epistemic boundaries to grasp at it.
The point Daniel makes about morality - that your actions if you don’t believe in moral truths should be the same as your actions if you do - IS relevant to people who care about INSTRUMENTAL epistemic rationality (the irrelevance of this matter is itself relevant, if you get what I mean).
“Mistakenly equivocating” is not quite fair. It’s plainly obvious that he meant “wrong” in the moral sense, considering he literally opened with “if there are no ethical truths…”. (Plus, I’m taking “assume” to mean “act as though” rather than “believe”, which also resolves your point of disagreement.)
I think the argument that explanations for the blue tentacle are bad because they wouldn’t have predicted the blue tentacle is flawed.
The theory that you are not hallucinating and that there is no greater intelligent power places far, far lower likelihood on waking up with a blue tentacle than the theory that there is either a greater intelligent power or you are hallucinating, even though both likelihoods are obscenely low. What matters is the likelihood ratio, so waking up with a blue tentacle is strong evidence that there is either a greater intelligent power or you are hallucinating. In the event that I woke up with a blue tentacle, I would adjust my beliefs and...
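To make the likelihood-ratio point concrete, here’s a minimal worked sketch with made-up numbers (the likelihoods below are purely illustrative assumptions, not anything from the original post). Writing the update in odds form:

$$\frac{P(H_{\text{weird}} \mid \text{tentacle})}{P(H_{\text{mundane}} \mid \text{tentacle})} = \frac{P(\text{tentacle} \mid H_{\text{weird}})}{P(\text{tentacle} \mid H_{\text{mundane}})} \cdot \frac{P(H_{\text{weird}})}{P(H_{\text{mundane}})}$$

where $H_{\text{weird}}$ is “you are hallucinating or there is a greater intelligent power” and $H_{\text{mundane}}$ is its negation. If, say, $P(\text{tentacle} \mid H_{\text{mundane}}) = 10^{-12}$ and $P(\text{tentacle} \mid H_{\text{weird}}) = 10^{-6}$, both likelihoods are obscenely low, but the Bayes factor is $10^{6}$: observing the tentacle shifts your posterior odds toward $H_{\text{weird}}$ by a factor of a million, exactly as the comment argues.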