I would consider 3 to be a few.
Do you feel confident that you could recognize a Bitcoin-like opportunity if one did appear, distinguishing it from countless other unlikely investments which go bust?
You should definitely post the entire quote here, not just the snippet with a link to the quote. For a moment I thought the one sentence was the entire quote, and nearly downvoted it for being trite.
While the quote is anti-rationality, it IS satirical, so I suppose it's fine.
I'm fairly confident it stands for "Society for Creative Anachronism".
Too strong.
Nobody EVER became successful through luck? Not even people born billionaires or royalty?
Nobody can EVER be happy without using intelligence? Only if you're using some definition of happiness that includes a term like "philosophical fulfillment" or some such, which makes the issue tautological.
The quote has always annoyed me too. People bring it up for ANY infringement on liberty, often leaving off the words "essential" and "temporary", which makes for a much stronger version of the quote (and, of course, an obviously wrong one).
Tangentially, Sword of Good was my introduction to Yudkowsky, and by extension, LW.
The tricky part is the "achievable levels of accuracy". It would probably be possible for, say, Galileo to invent general relativity using the orbit of Mercury. But from a pebble, you would need absurdly precise measurements.
Honestly, I did read the source, and it's very difficult to get anything useful out of it. The closest interpretation I can manage is "theory (in what? political science?) had become removed from other fields (in political science? in science generally?)".
In general, if context is needed to interpret the quote (i.e., it doesn't stand on its own), it's good to mention that context in the post, rather than just linking to a source and expecting people to follow a comment thread to understand it.
Sorry if this is overly critical; that was not my intention. I just don't get what the "internecine conflict" you are referring to is.
I'm not really getting anything from this other than "Mainstream philosophy, boo! Empiricism, yeah!"
Is there anything more to this post?
EV(Shot) = -$90
EV(No Shot) = -$104
Difference (Getting the shot minus not getting it) = -$90 - (-$104) = $14
Therefore, get the shot.
The first two values are in the tree. The difference can be figured out by mental arithmetic.
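For anyone who wants to re-derive those two numbers, here's a minimal sketch of the calculation in Python. The probabilities and dollar costs below are hypothetical placeholders I picked just to reproduce the EVs above; the actual inputs are whatever the decision tree in the post uses.

    # Minimal sketch of the flu-shot expected-value calculation.
    # All inputs are hypothetical placeholders chosen only to reproduce
    # the EVs quoted above; the real numbers live in the decision tree.
    shot_cost = 10          # out-of-pocket cost of the shot (hypothetical)
    flu_cost = 800          # total cost of catching the flu (hypothetical)
    p_flu_with_shot = 0.10  # P(flu | vaccinated)   (hypothetical)
    p_flu_no_shot = 0.13    # P(flu | unvaccinated) (hypothetical)

    ev_shot = -shot_cost - p_flu_with_shot * flu_cost  # -90
    ev_no_shot = -p_flu_no_shot * flu_cost             # -104
    difference = ev_shot - ev_no_shot                  # +14, favoring the shot

    print(f"EV(shot)    = {ev_shot:.2f}")    # EV(shot)    = -90.00
    print(f"EV(no shot) = {ev_no_shot:.2f}") # EV(no shot) = -104.00
    print(f"difference  = {difference:.2f}") # difference  = 14.00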
Would that be altruistic value? If I'm not mistaken, the cost of blood donation is generally just time, and the benefit is to other people. I have heard infrequent blood donation might have health benefits, but I don't know much about that.
Well, if you don't value your health at all, then this seems valid.
I have already gotten a flu shot this year, primarily because the cost of getting one is approximately 10 minutes and 0 USD (they're covered by the cost of attendance at my university and available in a very convenient location for me).
Also more than have died from UFAI. Clearly that's not worth worrying over either.
I'm not terrified of Ebola because it's been demonstrated to be controllable in fairly developed countries, but as a general rule this quote seems incredibly out of place on Less Wrong. People here discuss the dangers of things which have literally never happened before almost every day.
My moral position is different from (in fact, diametrically opposed to) Alice's, but I'm not going to say that Alice's morals are wrong.
You do realize she's implicitly calling you complicit in the perpetuation of the suffering and deaths of millions of animals, right? I'm having difficulty understanding how you can NOT say that her morality is wrong. Her ACTIONS are clearly unobjectionable (eating plants is certainly not worse than eating meat under the vast majority of ethical systems), but her MORALITY is quite controversial. I have a feeling like you ac...
There's no law of physics that talks about morality, certainly. Morals are derived from the human brain, though, which is remarkably similar between individuals. With the exception of extreme outliers, possibly involving brain damage, all people feel emotions like happiness, sadness, pain, and anger. Shouldn't it be possible to judge most morality on the basis of these common features, making an argument like "wanton murder is bad, because it goes against the empathy your brain evolved to feel, and hurts the survival chance you are born valuing"...
This is a somewhat frustrating situation, where we both seem to agree on what morality is, but are talking over each other. I'll make two points and see if they move the conversation forward:
1: "There's no reason to consider your own value system to be the very best there is"
This seems to be similar to the point I made above about acknowledging on an intellectual level that my (factual) beliefs aren't the absolute best there is. The same logic holds true for morals. I know I'm making some mistakes, but I don't know where those mistakes are. ...
What basis do you have for judging others' morality other than your own morality? And if you ARE using your own morality to judge theirs, aren't you really just checking for similarity to your own?
I mean, it's the same way with beliefs. I understand not everything I believe is true, and thus I understand intellectually that someone else might be more correct (or less wrong, if you will) than me. But in practice, when I'm evaluating others' beliefs, I basically compare them by how similar they are to my own. On a particularly contentious issue,...
Edit: I misunderstood what you said by "rationalize", sorry.
As Polymath said, rationalization means "to try to justify an irrational position later", basically making excuses.
Anyway, I wouldn't worry about the downvotes; based on this post, the people downvoting you probably weren't being passive-aggressive, but rather misinterpreted what you posted. It can take a little while to learn the local beliefs and jargon.