Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: Kaj_Sotala 02 February 2009 09:53:26AM 9 points

Legalized rape?

Weirdtopia, alright.

Comment author: christopherj 16 March 2015 05:05:57AM -1 points

Weirdtopia? No -- history. For example, biblical law allowed capturing the enemy's women as loot and having sex with one's slaves, and I'm fairly certain that a woman's wishes in terms of consent mattered far less than those of the man in charge of her. I seem to recall that at some point in Europe the feudal lord or whatever could have his way with your wife, and you had no recourse. This, of course, probably has more to do with inequality than anything else.

As for consent, it's ... complicated. For one thing, it exists in the mind and thus cannot reliably leave a physical trace (and because memory works by retroactively fitting facts into a narrative, not even the owner of the brain can be certain). And then there's sleep, and drugs, and mental illness, and changing one's mind, and how we decided that none of the usual rules apply when the person is below a certain age. As a hypothetical example, consider a mute quadriplegic who can only communicate by blinking, gave consent, then withdrew consent halfway through the act, but while their partner couldn't see their eyes.

Besides, it's not like any modern society would allow assault or harassment, so if they got rid of the laws concerning the special case where sex is involved, it wouldn't really change much.

Comment author: christopherj 30 May 2014 04:01:35PM 0 points

Pascal's mugging against an actual opponent is easy. If they are able to carry out their threat, they don't need anything you would be able to give them. If the threat is real, you're at their mercy, and you have no way of knowing whether acceding to their demand will actually make anyone safer; whereas if they're lying, you don't want to be giving resources to that sort of person. This situation is a special case of privileging the hypothesis, since you're considering a nearly impossible event, for no particular reason, while ignoring all the other events that are just as improbable.

If we're talking about a metaphor for general decision-making, eg an AI whose actions could well affect the entirety of the human race, it's much harder. I'd probably have it ignore any probabilities below x%, where x is chosen small enough that little of value is lost, yet large enough that the AI isn't paralyzed by worrying about the huge number of improbable things. Not because it's a good idea, but because as probability approaches zero, the number of things to consider approaches infinity, yet processing power is limited.
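A minimal sketch of the thresholding heuristic described above: discard any outcome whose probability falls below a cutoff before computing expected utility. All the names and numbers here are invented for illustration, not taken from any actual decision-theory implementation.

```python
def truncated_expected_utility(outcomes, epsilon=1e-6):
    """Expected utility over (probability, utility) pairs,
    ignoring outcomes less probable than epsilon."""
    return sum(p * u for p, u in outcomes if p >= epsilon)

# A mugger's offer: a near-certain small loss vs. a near-impossible huge payoff.
offers = [(0.999, -5), (1e-12, 10**12)]

# The naive expected utility is dominated by the nearly impossible event...
naive = sum(p * u for p, u in offers)
# ...while the truncated version simply ignores it.
truncated = truncated_expected_utility(offers)
```

The catch the comment points at: any fixed epsilon is arbitrary, and the mugger can always quote a payoff large enough that p * u stays huge for any p above your cutoff.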

Comment author: christopherj 16 May 2014 05:33:27AM 0 points

It is pretty much a necessity that humans will believe contradictory things, if only because consistency-checking each new belief against each of your current beliefs is impossibly difficult. Cognitive dissonance won't occur if the contradiction is so obscure that you haven't noticed it, or if you wouldn't even understand exactly how a belief contradicts a set of 136 other beliefs even if it were explained to you. Even if you could check for contradictions, your values change drastically from one hour to the next (how much you value food, water, company, solitude, leisure, etc), and that will change all your beliefs that start with "I want ...". Most likely you actually have different bits of brain with different values vying for dominance.
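The rough arithmetic behind "impossibly difficult": a contradiction need not involve just a pair of beliefs, it can hide in any subset of them, so the number of checks explodes combinatorially. The belief count below is an arbitrary illustration.

```python
from math import comb

n = 1000  # a modest number of beliefs, chosen arbitrarily

# Pairwise contradictions alone need comb(n, 2) checks...
pairs = comb(n, 2)

# ...but a contradiction can involve any subset of 2 or more beliefs,
# of which there are 2**n - n - 1 -- astronomically more.
subsets = 2**n - n - 1
```

Even at the pairwise level, adding one belief to a thousand existing ones means a thousand fresh checks; nobody does that, so unnoticed contradictions accumulate.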

Moreover, many times a belief is part of a group membership (eg "I support [cause]"), or simply feels good (eg "I am a good person"). People will not appreciate it if you point out contradictions in these things, possibly because they are instrumental rather than epistemic beliefs. There is no doubt that professing contradictory beliefs can be highly beneficial (eg "Republicans are fiscally conservative, want small government, cut taxes, more money for the military, and enforcing morality"; if you reject any of that, you're not a viable candidate).

Comment author: JTHM 09 May 2014 02:43:28AM 10 points

Lying constantly about what you believe is all well and good if you have Professor Quirrell-like lying skills and your conscience doesn't bother you if you lie to protect yourself from others' hostility to your views. I myself lie effortlessly, and felt not a shred of guilt when, say, I would hide my atheism to protect myself from the hostility of my very anti-anti-religious father (he's not a believer himself, he's just hostile to atheism for reasons which elude me).

Other people, however, are not so lucky. Some people are obliged to publicly profess belief of some sort or face serious reprisals, and also feel terrible when they lie. Defiance may not be feasible, so they must either use Dark Side Epistemology to convince themselves of what others demand they be convinced of, or else be cursed with the wrenching pain of a guilty conscience.

If you've never found yourself in such a situation, lucky you. But realize that you have it easy.

Comment author: christopherj 16 May 2014 04:32:38AM 1 point

I myself lie effortlessly, and felt not a shred of guilt when, say, I would hide my atheism to protect myself from the hostility of my very anti-anti-religious father (he's not a believer himself, he's just hostile to atheism for reasons which elude me).

Hm, an atheist who hides his atheism, from his father who also seems to be an atheist (aka non-believer) but acts hostile towards atheists? Just out of curiosity, do you also act hostile towards atheists when you're around him?

Comment author: fezziwig 30 April 2014 02:49:17PM 2 points

FWIW these questions have standard answers in Christian doctrine: he didn't want to be tortured to death, but he wanted to do God's will more than he not-wanted to be crucified. Part of the point of the story is that you don't have to cheerfully volunteer, you just have to volunteer. It's ok to be sad or afraid.

Comment author: christopherj 10 May 2014 02:35:30PM 0 points

FWIW these questions have standard answers in Christian doctrine: he didn't want to be tortured to death, but he wanted to do God's will more than he not-wanted to be crucified.

Sure, but don't forget that in Christian doctrine Jesus=God. This vastly complicates the issue: God-the-Father demands that God-the-Son die on behalf of the sins of humanity, which God-the-Son doesn't want to do but is willing to do, because it's what God-the-Father requires in order to bring Himself to forgive people (and He may have been ordered to as well). I don't know what would happen if God disobeys Himself.

Comment author: christopherj 06 May 2014 12:00:06AM 1 point

Summary: Agreeing with people who insufficiently ironman an argument will be treated as agreeing that the argument is complete rubbish.

Comment author: christopherj 06 May 2014 12:11:21AM 0 points

And I expect the reason is that people who insufficiently ironman an argument are either more interested in the argument's technical correctness, or more interested in discrediting the claim.

Comment author: christopherj 04 May 2014 05:14:02AM 1 point

Supplemental data preservation seems like a synergistic match with cryonics. You'd want to collect vast amounts of data with little effort, so no diaries, random typing, or asking friends to memorize facts. MRIs and other medical records might help; keeping a video or audio recording of everything you do, and recording everything you do with your computer, should take little time and might preserve something that aids cryonic reconstruction.

Simulation-based preservation attempts may be more likely than people expect, based on the logic that simulated humans likely outnumber physical humans (we could be in a simulation to determine how many simulations per human we will eventually make ourselves). However, it is clear that the simulator(s) either are already communicating with us or do not care to, and to gain any more direct access to their attention we'd have to hack the simulation, in which case there may be more clever things to do than call attention to our hacking -- though the simulators likely have highly advanced security technology compared to ours. Alternately, given that we are probably being simulated by other humans, and they might be watching, we may be able to appeal to their empathy.

Evolutionary Preservation and Genetic Preservation depend on a misunderstanding of genetics, Philosophical Preservation on a misunderstanding of the nature of reality vs rationalization, and Time-travel Preservation suggests that making a commitment that 10%-50% of humans have already made will make you notable to time travelers. This sort of thing detracts from your suggestion, since you're grasping at straws to find alternatives.

Granted, it's hard to find alternatives. I suppose EEG data could be collected as well, and would also have research benefits. However, like most of the other data that could be collected, it would probably only suffice as a sanity check on your cryonic reconstruction.

Comment author: asr 02 May 2014 06:43:43AM 1 point

Another advantage of replicating the original discovery is that you don't accidentally use unverified equipment or discoveries (ie equipment dependent on laws that were unknown at the time).

I don't consider this an advantage. My goal is to find vivid and direct demonstrations of scientific truths, and so I am happy to use things that are commonplace today, like telephones, computers, cameras, or what-have-you.

That said, I certainly would be interested in hearing about cases where there's something easy to see today that used to be hard -- is there something you have in mind?

Comment author: christopherj 02 May 2014 01:50:09PM 1 point

I don't consider this an advantage. My goal is to find vivid and direct demonstrations of scientific truths, and so I am happy to use things that are commonplace today, like telephones, computers, cameras, or what-have-you.

Well, you could use your smartphone's accelerometer to verify the equations for centrifugal force, or its GPS to verify parts of special and general relativity, or the fact that its chip functions to verify parts of quantum mechanics. But I'm not sure how you can legitimately claim to be verifying anything: if you don't trust those laws, how can you trust the phone? It would be like using a laser rangefinder to verify the speed of light. For this sort of thing, the fact that your equipment functions is better evidence that the people who made it know the laws of physics than any test you could do with it.
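The accelerometer check above might look like this: spin the phone at a known rate and compare the predicted centripetal acceleration a = ω²r (equal in magnitude to the centrifugal term in the rotating frame) against the sensor reading. The turntable numbers are hypothetical, purely for illustration.

```python
import math

def centripetal_acceleration(rpm, radius_m):
    """Predicted acceleration magnitude for circular motion:
    a = omega^2 * r, with omega converted from rpm to rad/s."""
    omega = rpm * 2 * math.pi / 60.0
    return omega ** 2 * radius_m

# Hypothetical setup: phone taped 10 cm from the center of a 45 rpm turntable.
predicted = centripetal_acceleration(rpm=45, radius_m=0.10)  # ~2.2 m/s^2
# A real test would compare `predicted` against the phone's reported
# acceleration after subtracting gravity's component.
```

Which, of course, illustrates the comment's point: the phone's sensors were calibrated by people who already trusted these laws, so agreement mostly confirms the calibration.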

Comment author: christopherj 02 May 2014 01:25:58AM 1 point

Some hypotheses:

1) Words in the foreign language are not tainted with morality. Using more neutral words in the problem description would have a similar effect.

2) The extra time taken to parse the foreign language description forces more time to think about the problem. Saying the problem slowly, or writing with a huge font, would have a similar effect.

3) The distraction of translating has an effect. Giving the subjects an additional task to do would have a similar effect.

Other studies showed an effect of language helping to discriminate between things like two different colors (aided if your language uses different words for them). That seemed like a different thing, perhaps an effect of categories and practice.
