I don't think I've ever seen the paradox of tolerance used that way. Even in the original formulation from Popper, it's specifically an argument for restricting the principle of tolerance, based on the consequences of society being too tolerant.
The problem with the paradox of tolerance (as I've seen it used) is that people use it as an argument to justify putting limits on the principle which are in fact arbitrary and unjustified; they just say "we can't tolerate the intolerant" as a cached excuse for doing violence to politi...
That's the reason she liked those things in the past, but "achieving her goals" is redundant; she should have known about that years in advance, so it's clear that she's grown so attached to self-improvement that she sees it as an end in itself. Why else would anyone ever, upon deciding to look inside themselves instead of at expected utility, replace thoughts of paragliding in Jupiter with thoughts of piano lessons?
Hedonism isn't bad; orgasmium is bad because it reduces the complexity of fun to maximising a single number.
I don't want to be upgr...
By believing it's important enough that when you come up with a system of values, you label it a terminal one. You might find that you come up with those just by analysing the values you already have and identifying some as terminal goals, but "She had long been a believer in self-perfection and self-improvement" sounds like something one decides to care about.
I'll add a datapoint to that and say an anonymous site like that would tempt me enough to actively go and troll, even though I'm not usually inclined towards trolling.
Although I picture it getting so immediately overwhelmed by trolls that the fun would disappear; "pissing in an ocean of piss" as 4chan calls it.
"Oh, that's nice."
They wouldn't exactly be accepting the belief as equally valid; religious people already accept that people of other religions have a different faith than they do. On at least some level they usually have to disagree with "other religions are just as valid as my own" to even call themselves believers of a particular religion, but it gets you to the point of agreeing to disagree.
Since my comment was vague enough to be misunderstood, I'll try to clarify what I thought the first time.
The dialogue reads as a comedy skit where the joke is "theists r dum". The atheist states beliefs that are a parody of certain attitudes of religious believers, and then the theist goes along with an obvious setup they should see coming a mile away. It doesn't seem any more plausible than the classic "rabbit season, duck season" exchange in Looney Tunes, so it's not valuable.
Don't feel I have the attention span (and/or spoons) right now to actually look through the draft, but I note that you misspelled "embarrass" while talking about whether you'd embarrassed yourself, which I thought was kinda funny.
Um, not intending to mock, just coincidental placing of a typo, I'm sure.
Believer: "I say it's duck season, and I say fire!"
Yeah, I don't see any real intellectual value to this.
The usual rule is to identify as an "aspiring rationalist"; identifying rationality as what you are can lead to believing you're less prone to bias than you really are, while identifying it as what you aspire to reminds you to maintain constant vigilance.
I think I can conceive of things that are logically inconsistent. I might just be ignoring the details that make it inconsistent when I do, but other cases where I conceive of a concept without keeping every detail in mind at once don't seem like examples of inconceivability.
Wouldn't the ability to have a false positive for a paradox itself be a sign that people can conceive of things that are paradoxical?
I like "effective egoism" enough already, the alternatives I've seen suggested sound dumb and this one sounds snappy. It might not be perfect for communicating exactly the right message of what the idea is about, but you can do that by explaining, and having a cool name can only be achieved within the name itself.
I accept that meat is more environmentally damaging per calorie (or similar such measures), and with the scale of the meat and dairy industry I'd accept saying it has a huge effect on the environment, but there are several steps between that and "if humanity doesn't go vegan soon, we will probably go extinct".
I didn't click through and there might be more context than this, but "chances only increase by 2 to 5 percent" is ambiguous between "percent (as an absolute probability)" and "percent (of the chance it was before)". I'm not sure if it qualifies as an "irrationality quote"; it's just unclear and could be confusing, but /u/PhilGoetz's version is a step up.
(I'd maybe not use "odds ratio multiplier", because we're not just concerned about clarity, but clarity to people who might be statistically illiterate)
The way the problem reads to me, choosing dust specks means I live in a universe where 3^^^3 of me exist, and choosing torture means 1 of me exist. I prefer that more of myself exist than not, so I should choose specks in this case.
In a choice between "torture for everyone in the universe" and "specks for everyone in the universe", the negative utility of the former obviously outweighs that of the latter, so I should choose specks.
I don't see any incongruity or reason to question my beliefs? I suppose it's meant to be implied that it's ...
She fangirls over the remake? I've never heard the remake described as anything other than some variant of "lifeless", especially from fans of classic Sailor Moon.
EDIT: Forgot it was the positivity thread for a second, let me have another go at that: So I guess maybe I should have another go at the remake! I actually really like being convinced to like a show I was previously "meh" about. Some shows it's more fun to get a hateboner/kismesis thing going for, but Sailor Moon Crystal isn't one of them.
The problem is that ethics can work with other axioms. Someone might be a deontologist, and define ethics around bad actions e.g. "murder is bad", not because the suffering of the victim and their bereaved loved ones is bad but because murder is bad. Such a set of axioms results in a different ethical system than one rooted in consequentialist axioms such as "suffering is bad", but by what measure can you say that the one system is better than the other? The difference is hardly the same as between attempting rationality with empiricism vs without.
Well, I don't think "a bit of a middle-ground" justifies taking a stance calling full-on moral relativism "immoral, pointless & counterproductive".
"Suffering is bad" seems a lot easier to agree on as a premise than it actually is - taken by itself, just about anyone will agree, but taken as a premise for a system it implies a harm-minimising consequentialist ethical framework, which is a minority view.
And it's simple enough to consistently be pro-life but also support the death penalty: if one believes a fetus at whatever ...
You could probably have just covered Ubuntu with "I'm not talking about the OS, I'm talking about a philosophy/ideology used by Mugabe".
Although as for moral relativism... bad idea by whose standard? By what logic? If it's irrational nonsense to be a moral relativist, do you have a rational argument for moral realism?
If they know that few names from my era, they probably know similarly little about each one. I play "Albert Einstein", but it's obvious to any popsicles from the same era that I'm actually Rick Sanchez. This develops into an in-joke where basically every "Albert Einstein" is really playing Rick Sanchez. We ruin everything with drunken debauchery, then ???, profit, take over the degenerate binge-drinking wasteland society becomes.
If you think this has non-negligible negativity*probability, you've got the conjunction fallacy up the wazoo. Although what it actually reads as is finding a LessWrong framing and context to post the kind of furry hate you'd see in any other web forum, which isn't very constructive.
So I'll respond at the same level of discourse to the scenario: "Bitch, I watched Monster Musume. My anaconda don't want none unless she's part anaconda. Your furfags are tame. Didn't you at least bring back any pegasisters? IWTCIRD!"
Now, not so much being inclined towards tho...
"So, specifically my generation, not my parents' or Queen Victoria's or... yours? That's a bold strategy, let's see if it pays off."
Maybe I have to spend a thousand years entertaining myself by making up total bullshit about my culture to troll the scientists, but eventually some group with completely different political beliefs will take over, and maybe I'll share the same fate as the zookeepers, but I'll damn sure be beaming the smuggest shiteating I-told-you-so grin at the zookeeper while the 41st-century neonazis hang us both in their day of th...
Would seem to imply memories don't make up who you are - I mean, what I'm inclined to read into it is "there are souls and they got moved around", but it could be anything - in which case, if there's a way to cause myself amnesia (and with this level of tech why wouldn't there be?) I should just wipe out my memories and find out who I am. Ideally it'll also be possible to save the memories in backups somehow, or I'll have "external memory" like diaries and such, in case I start regretting the decision.
That scenario still sounds awesome, as long as I'm comparing it to "no cryonics" instead of "best-case cryonics scenarios". I get to be dropped into a completely unfamiliar world with just my mind, a small sum of money, and a young healthy body? Sounds like a fun challenge; I mean, I died once, what have I got to lose?
Yes - I mean existential crisis in the sense of dread and terror from letting my mind dwell on my eventual death; convincing myself I'm immortal is a decisive solution to that, insofar as I can actually convince myself. I don't mind existence being meaningless, it is that either way; I care much more about whether it ends.
Storing data that might be used to reconstruct someone in the future isn't really objectionable, but that seems separate from actually using that data to create the resurrection. And it probably works out fine in the utilitarian calculus unless you count the sunk cost vs creating a "better" new person or a utility monster, but bringing someone back to life just because they didn't mention that they didn't want it, or because you thought the reason they gave for not wanting it was irrational, sounds really skeevy. We have rules about consent for interacting with other people's bodies, and I think that includes implanting their consciousness in new bodies.
I know that, at least in our specific community, we'd rather be resurrected than not, and in a techno-utopian future especially that almost goes without saying, but it still worries me that you don't seem to mention consent. At least the top paragraph suggests a third party collecting information about someone else so that they can be resurrected after their death, and even if we skip over the more normal issues with doing that, resurrecting someone without their permission seems like a violation.
In the mix with the problems you've listed under 1. is whether t...
There's a vast difference between being "almost god-like" and being God, and as long as you don't equate the two then there's no contradiction.