I think these are very good examples; I would agree with C), disagree with D), require clarification on B), and have no strong opinion on A). Others might have different opinions. I further think that amassing a wealth of examples like this, and selecting a subset where there is general agreement on which side of the fence they lie, is necessary for a productive discussion of the issue.
If you intend to try again in the current open thread, feel free to transfer the examples.
Trying to clarify my intuitions re. B:
Consider Paul Atreides undergoing the gom jabbar; he will die unless he keeps his hand in the box. Given that he knows this, I count his success as a freely willed action; if (counterfactually) the pain had been sufficient to overcome him, withdrawing his hand would not have been freely willed, because it is counter to his consciously endorsed values (and, in this case, not subtle or confused values).
However, if (also counterfactually) the threat of death had not been present or known to him, then withdrawing his hand may have been a freely willed act (if the pain built slowly enough to be noticed rather than just triggering a burn-reaction).
By extensional definition I mean fencing off the notion of free will with a set of reasonably sharp (close to the free will/not free will boundary) examples of not having free will.
A rock not having free will is uncontroversial, but not sharp (very far from the boundary). I am looking for a set of examples where most people would agree that
- It is an example of not having free will (uncontroversial).
- It is hard to move it toward the "definitely free will" case without major disagreement from others (reasonably sharp).
Pretty sure I'm misparsing you somehow, but here are some things I might consider nonfree actions:
A) an action is rewarded with a heroin fix; the actor is in withdrawal
B) an action will relieve extreme and urgent pain
C) an action is demanded by reflex (e.g. withdrawal from heat)
D) an action is demanded by an irresistibly salient emotional appeal that the agent does not reflectively endorse (release the country-slaying neurotoxin, or I shall shoot your child)
Can someone give an extensional definition of free will? Or link to one.
Are you asking for a procedure for identifying acts of free will (the doable kind of extensional definition), or a set of in-out exemplars (an ostensive definition)?
Meetup : LW London regular meetup
Discussion article for the meetup : LW London regular meetup
Edit: weather is a bit meh; will start off in the SH.
The next LW London meetup will be on June 28th. Join us from 2pm to talk about vote trading, Boltzmann chickens, and ethical Bitcoin fanfiction. Or something.

If the weather is nice, we'll be in Lincoln's Inn Fields. If not, we'll be in our usual Shakespeare's Head, just around the corner. My number is 07860 466862; call it if you have difficulty finding us, and ideally not otherwise.

About London LessWrong:

We run this meetup approximately every other week; these days we tend to get in the region of 5-15 people in attendance. By default, meetups are just unstructured social discussion about whatever strikes our fancy: books we're reading, recent posts on LW/related blogs, logic puzzles, toilet usage statistics....

Sometimes we play The Resistance or other games. We usually finish around 7pm, give or take an hour, but people arrive and leave whenever suits them.

Related discussion happens on both our google group and our facebook group.
Meetup : LW London regular meetup
Discussion article for the meetup : LW London regular meetup
The next LW London meetup will be on June 14th. Join us from 2pm to talk about vote trading, Boltzmann chickens, and ethical Bitcoin fanfiction. Or something.
If the weather is nice, we'll be in Lincoln's Inn Fields. If not, we'll be in our usual Shakespeare's Head, just around the corner. My number is 07860 466862, call it if you have difficulty finding us, and ideally not otherwise.
About London LessWrong:
We run this meetup approximately every other week; these days we tend to get in the region of 5-15 people in attendance. By default, meetups are just unstructured social discussion about whatever strikes our fancy: books we're reading, recent posts on LW/related blogs, logic puzzles, toilet usage statistics....
Sometimes we play The Resistance or other games. We usually finish around 7pm, give or take an hour, but people arrive and leave whenever suits them.
Related discussion happens on both our google group and our facebook group.
I agree with you there. What I mean by selfish preferences is that after the copies are made, each copy will value a cookie for itself more than a cookie for the other copy: it's possible that a copy wouldn't buy the other copy a cookie for $1, but would buy itself a cookie for $1. This is the indexically-selfish case of the sort of preferences that cause people to buy themselves a $1 cookie rather than giving that $1 to GiveDirectly (which is what they'd do if they had made their precommitments behind a Rawlsian veil of ignorance).
Confused. What's incoherent about caring equally about copies of myself, and less about everyone else?
TV and Movies (Animation) Thread
I've just finished marathoning the first 1.5 seasons (to the current cliffhanger/hiatus) of Gravity Falls, and strongly recommend it. Supernatural mystery/horror/comedy, significantly darker than Disney usually gets. High levels of continuity; very strong art direction; near-HPMOR levels of foreshadowing/conservation of detail (I advise not reading about it beforehand as there was a similar hivemind-predictive-success of the biggest twist). Secret codes, cryptic Reddit AMAs, trolling creators with hand puppets, all the good stuff.
The Unbreakable Vow is basically giving people the death penalty with no way to ask for any kind of exemption due to unforeseen circumstances; it's not something to be used lightly. Also, in Methods of Rationality, making the Vow requires someone to permanently lose some of their magic, which is likewise not something to be done lightly.
Don't follow. You see "making an actually binding promise" as equivalent to dying?
Right. This is why I said that total obliviation is worse than death. Not only are you removed, you can later be used to support purposes outright opposed to your goals, as Harry intends to do with Voldemort.
This seems odd to me, though I'm not saying you're wrong. From the inside, my values seem far more akin to habits or reflexes than to time-indexed memories.
I imagine Obliviated!me still having a NO DON'T reaction when asked to support a purpose opposed to my previous goals, because verbalised goals flow from wordless moral habits, not the other way around. (This assumes a possibly inconsistent scenario where I retain enough language for someone to expect to manipulate me.)
I think I can make this! Any tips for identifying the group?
Apologies for no response; I vaguely assumed I would get a notification if anyone commented. I think we'll start in the Shakespeare's Head as it's a bit cloudy. There will be a sign up. Otherwise, climb the nerd gradient until you find us; we're usually in the back third past the bar.