I believe the evidence is that the initial urge of A is more credible than the rationalization of B. That is, when students change answers on multiple choice tests, they are more likely to turn a right answer to a wrong answer than a wrong answer to a right answer. (I don't know if that generalizes to a true-false setting.)
It matters why "B sounds more plausible to your mind." If it's because you remembered a new fact, or because you reworked the problem and came out with B, change the answer (after checking that your work is correct). Many multiple-choice tests are written so that there is one right answer, one obviously wrong answer, and two plausible-sounding answers, so you shouldn't change an answer just because B is starting to sound plausible.
"No, no!" says the philosopher. "In the thought experiment, they aren't randomly generating lots of GLUTs, and then using a conscious algorithm to pick out one GLUT that seems humanlike! I am specifying that, in this thought experiment, they reach into the inconceivably vast GLUT bin, and by pure chance pull out a GLUT that is identical to a human brain's inputs and outputs! There! I've got you cornered now! You can't play Follow-The-Improbability any further!"
In my (limited) understanding of the way the universe began, it was all pretty random.
Evolution seems to have been pretty random, too.
So how did we end up being conscious?
And I was also wondering, does "randomness" exist? Or was the history of the universe set from the moment of the big bang?
(Please, I'm not trying to be clever, I just want to know the answer!)
Sounds like zombies to me. Does the robot know he's a robot?
Could someone please tell me why that comment was voted down?
I'm not trying to be sarcastic or anything, I just want to know.
Can someone tell me, or is there a list somewhere, "all the other things that rationalists are supposed to say on such occasions"?
I find that having bits that come to mind automatically in certain situations really helps me to go about thinking in the right way (or at least a way that's less wrong.)
Could someone please explain to me why this is downvoted?
I'm not trying to be sarcastic or anything, and the comment above was sincere.
I just want to know what I said wrong.
Thank you.
Why do I think I have free will?
I think I have free will because I tell my hand to type and it types.
And why do I think that that was my own free will and not somebody or something else's?
Wait, what do I even mean when I say "free will"?
I mean that I could do whatever I wanted to.
And what controls what I "want" to do? Is it me or something/one else?
Why do I think that I control my own thoughts?
My thoughts seem instantaneous, maybe I don't control my own thoughts.
I can say things without thinking about them beforehand. Sometimes I agonize over a decision (it's a Saturday: should I get out of bed right now or later?) and I choose one option without coming to a conclusion and without knowing why I chose it.
Maybe, subconsciously, I was hungry, or obeying a habit.
If I was hungry, or if some other instinct was propelling me, then I don't really have free will when it comes to simple things like this. Then again, "I" can override my instincts, so my instincts are serving me as a mental shortcut and I am not a slave to them, so I do have free will.
If it was a habit, it was I who created my habits by repetition, so I have free will. I can also override my habits.
Who's to say that my overrides aren't controlled by something/one?
I feel like I have free will, but maybe that's how whatever controls me "wants" me to feel.
Maybe I'm just a zombie, writing paragraphs on free will because the laws of nature are making me do it.
In that case, how am I supposed to assume that I am, in fact, correct about me having free will?
So I don't have free will at all? Is that the answer that other people have gotten to? Are there gaping holes (or even tiny holes) in my logic, and are there angles that I haven't considered yet?
I still feel like I have free will. Maybe I should have written that like, 'I still feel like I have "free will".'
This may be like the time the math teacher told me to prove that two lines were parallel and I couldn't because I didn't know about Thales' theorem.
Could someone please help me figure this out? I don't see a way to continue from, "Either I have free will, or who/whatever is controlling me is making me think that I have free will." I'm not sure how those two universes would be different.
Edit: In a universe where someone is controlling me, I'm guessing "he" would have a plot in mind. The universe doesn't appear to have a plot, but maybe I'm just too small to see it, or- wait, who says the universe doesn't appear to have a plot? I don't think I know enough to answer this question. Help?
Why do humans think that they have free will?
What kind of situation would favour humans who thought that they had free will over humans who didn't?
Will to survive?
No, that's not the right question, I'm off track.
I'm drawing a complete blank.
What is there in my head that makes it so that I think I have free will?
I keep thinking in circles. I'm trying to differentiate the answer to this question from the answer to the question "Why do I think I have free will?", but every time I get close there is literally a giant blank. I don't think I know enough about how human brains work in the first place to answer this question.
Oh, no, here we go:
Why do I think that I don't know enough about how my mind works to answer this question? I live in it, after all.
Well, I can't answer the question, that seems like ample proof to me, although it might not be.
I think that I could work out everything I needed to know given enough time, but why start from scratch when other great minds have done the work for you?
Can anyone direct me to some resources I can use to better understand the internal algorithms of the human mind, please?
My chess playing software considers options and makes a decision. Does it have free will?
If an abstract theory (such as the whole universe being governed by billiard ball causation) contradicts a direct observation, you don't say the observation is wrong, you say the theory is.
Your chess-playing software must make the decision that is most likely to win the game, whereas humans don't have anything to stop us from making a bad decision.
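The point about the software's "decision" can be made concrete. A minimal sketch (the evaluation scores below are made up for illustration; a real engine would search the game tree): the program's choice is just the argmax of a fixed scoring function over its options, so the same position always produces the same move.

```python
# Toy "chess engine" decision procedure: a deterministic argmax over options.
# The scores here are hypothetical stand-ins, not output of any real engine.

SCORES = {"e4": 0.30, "d4": 0.29, "Nf3": 0.25, "a3": -0.10}

def evaluate(move):
    # In a real engine this would be a tree search; here it's a fixed lookup.
    return SCORES[move]

def decide(legal_moves):
    # The "decision" is a pure function of the inputs: given the same
    # position, the program picks the same move every time. It has no
    # mechanism for choosing a move it evaluates as worse.
    return max(legal_moves, key=evaluate)

print(decide(["e4", "d4", "Nf3", "a3"]))  # always "e4"
```

Whether you call that deterministic selection "free will" is exactly the question being debated above.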
"If we cannot learn to take joy in the merely real, our lives will be empty indeed."
It's true... but... why do we read sci-fi books then? Why should we? I don't think that after reading a novel about intelligent, faster-than-light starships the bus stopping at the bus stop nearby will be as interesting as it used to be when we were watching it on the way to the kindergarten... Or do you think it is? (Without imagining starships in place of buses, of course.)
So what non-existing things should we imagine to be rational (= to win), and how? I hope there will be some words about that in tomorrow's post, too...
That doesn't mean that we can't take joy in what is not merely real, nor that we should be delighted every time we see the bus stopping at the bus stop.
There are four types of things in the world:
* Things that are real and uninteresting.
* Things that are real and interesting.
* Things that are unreal and uninteresting.
* Things that are unreal and interesting.
I assume that no one would invent something unreal and uninteresting, so that leaves us with three categories.
In this article, Eliezer argues that the category real and interesting exists.
He doesn't say that the two remaining categories don't exist.
So feel free to enjoy your unreal, interesting sci-fi, and to disregard the real, uninteresting bus stops.
(Not that I'm implying that bus stops and other mundane things can't be interesting as well, but no one is interested in everything.)
I find that thinking this way gives us a better perspective on a lot of things, like when people say, "People only want what's bad for them."
(Um, I can't figure out how to do bulleted lists. I've copied the little asterisk thing directly from the help page, but I still can't get it to work. Could someone tell me what I've done wrong?)
One man's modus tollens is another man's modus ponens. Reductionism is true; therefore, there is, in fact, no "free will" in the sense that Ian C. seems to be implying. ;)
I can't predict tomorrow's weather; does that mean atmospheres have free will?
The other antireductionism argument I can think of looks a little like this:
Anti-reductionist: "If the laws of physics are sufficient to explain reality, then that leaves no room for God or the soul. God and souls exist, therefore reductionism is false."
And the obvious counterargument is, of course...
Reductionist: "One man's modus tollens is another man's modus ponens. Reductionism is true; therefore, there is, in fact, no God."
At this point, the anti-reductionist gathers a lynch mob and has the reductionist burned at the stake for heresy.
It's still possible to have a little bit of respect for people who are obviously wrong.
I read a book once about how, when we're looking at other people we know are wrong, we have to see their ignorance and try to remedy it instead of making them into the enemy. We have to see the disease behind the person.
Same here.
And here. Maybe we could start with probability theory, seeing as how that seems to be really central to this site.