I have read this post before and agreed with it. But reading it again just now, I have new doubts.
I still agree that beliefs should pay rent in anticipated experiences, but I am no longer sure that the examples given here demonstrate it.
Consider the example of the tree falling in a forest. Both sides of the argument do have anticipated experiences connected to their beliefs. For the first person, the test of whether the tree makes a sound is to place an air-vibration detector in the vicinity of the tree and check it later. If it did detect some...
How about this:
People are divided into pairs; say A and B are one pair. A gets a map of something fairly complex, but not too complex: for example, an apartment with a sufficiently large number of rooms. A's task is to describe this to B. Once A and B are both satisfied with the description, B is asked questions about the place the map represents. Here are examples of questions that could be asked:
How many left turns do you need to make to go from the master bedroom to the kitchen?
Which washroom is nearest to the game room?
You are sittin...
I will come too.
Hey, I live in Waterloo too. I will join. (Perhaps not this one, but any subsequent ones organized in Waterloo after the 24th of this month.) Please keep me posted, and let me know if you need any help organizing this.
Pretty neat. Thanks!
If you have many things to do and you are wasting time, number those things from 1 to n, assign n+1 to wasting time, and then use http://random.org to generate a random number between 1 and n+1 (both included) to decide what you should do. This adds some excitement and often works.
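If you want to automate this instead of visiting random.org each time, here is a minimal sketch in Python (the task names are placeholders I made up):

```python
import random

# Items 1..n are the things you have to do; item n+1 is "wasting time".
tasks = ["write report", "reply to emails", "clean desk"]  # hypothetical list
tasks.append("waste time")  # the (n+1)-th option

# random.randint is inclusive on both ends, like random.org's generator.
pick = random.randint(1, len(tasks))
print("Do this now:", tasks[pick - 1])
```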
I live in Waterloo, Ontario (Canada). Does anyone live nearby?
I'm in too.
Consulting a dataset, counting the number of times the event occurred, and so on would be a rather frequentist way of doing things. If you are a Bayesian, you are supposed to have a probability estimate for any arbitrary hypothesis that is presented to you. You cannot say, "Oh, I do not have the dataset with me right now; can I get back to you later?"
What I was expecting as a reply to my question was something along the following lines. One would first come up with a prior for the hypothesis that the world will be nuked before 2020. Then, one would ident...
So 200:1 is your prior? Then where is the rest of the calculation? Also, how exactly did you come up with that prior? How did you decide that 200:1 is the right place to stop? In other words, can you claim that if a completely rational agent had the same information you have right now, that agent would also arrive at a prior of 200:1? What you have described is just a way of measuring how much you believe in something; what I am asking is how you decide how strong your belief should be.
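For what it's worth, here is the shape the rest of the calculation could take, in odds form; the likelihood ratio below is a number made up purely for illustration:

posterior odds = prior odds x likelihood ratio, i.e.
P(H|E) / P(not-H|E) = [P(H) / P(not-H)] x [P(E|H) / P(E|not-H)]

So if your prior odds against "the world is nuked before 2020" are 200:1, and you then observe some evidence E that you judge to be 10 times more likely in worlds that get nuked than in worlds that don't, your posterior odds against drop to 200:10 = 20:1, i.e. a probability of 1/21, roughly 4.8%, that it happens. The arithmetic is mechanical; the hard part, as you say, is justifying the prior and the likelihood ratio in the first place.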
I want to understand Bayesian reasoning in detail, in the sense that I want to take a statement relevant to our daily life and try to find exactly how much I should believe in it based on the beliefs that I already have. I think this might be a good exercise for the LW community. If so, let's take a statement, for example, "The whole world is going to be nuked before 2020." Now, based on whatever you know right now, you should form some percentage of belief in this statement. Can someone please show me exactly how to do that?
Hello.
Now can I get some Karma score please?
Thanks.
Is the fact that students who are motivated to get good exam scores very often get better scores than students who are genuinely interested in the subject also an application of Goodhart's Law?
Partially; but a lot of what is being tested is actually skills correlated with being good at exams - working hard, memorisation, bending yourself to the rules, ability to learn skill sets even if you don't love them, gaming the system - rather than interest in the subject.
Yes, I should be more specific about 2.
So let's say the following are the first three questions you ask and their answers -
Q1. Do you think A is true? A. Yes.
Q2. Do you think A=>B is true? A. Yes.
Q3. Do you think B is true? A. No.
At this point, will you conclude that the person you are talking to is not rational? Or will you first want to ask him the following question:
Q4. Do you believe in Modus Ponens?
or in other words,
Q4. Do you think that if A and A=>B are both true then B should also be true?
If you think you should ask this question before dec...
I think one important thing to keep in mind when assigning prior probabilities to yes/no questions is that the probabilities you assign should at least satisfy the axioms of probability. For example, you should definitely not end up assigning equal probabilities to the following three events -
I am not sure if your scheme ensures that this does not happen.
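To give a hypothetical example of the kind of incoherence I mean: if three events are mutually exclusive and jointly exhaustive, assigning each of them probability 1/2 makes the total 3/2, which no probability distribution allows. A quick sketch of such a coherence check (the outcome names are invented):

```python
# Probabilities over mutually exclusive, jointly exhaustive outcomes
# must sum to 1; this is one of the axioms of probability.
beliefs = {
    "Strigli wins": 0.5,
    "Strigli loses": 0.5,
    "game is called off": 0.5,  # hypothetical third outcome
}

total = sum(beliefs.values())
if abs(total - 1.0) > 1e-9:
    print("Incoherent: probabilities sum to", total, "not 1")
```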
Also, to me, Bayesianism sounds like an iterative way of form...
I have two basic questions that I am confused about. This is probably a good place to ask them.
What probability should you, as a Bayesian, assign to the answer of a yes/no question being yes if you have absolutely no clue what the answer should be? For example, let's say you are suddenly sent to the planet Progsta, and a Sillpruk comes and asks you whether the game of Doldun will be won by the team Strigli.
Consider the following very interesting game. You have been given a person who will respond to all your yes/no questions by assigning a probabili...
I think one thing that evolution could easily have done with our existing hardware is to at least allow us to use rational algorithms whenever it's not intractable to do so. This would have easily eliminated things such as akrasia, where our rational thoughts do give a solution, but our instincts do not allow us to use it.
There seem to exist certain measures of quality that are second-level, in the sense that they measure quality indirectly, mostly because the indirect way seems easier. One example is sex appeal. The "quality" of a potential mate should be measured just by the number of healthy offspring they can give birth to. However, that is difficult to find out, and hence evolution has programmed us to refer to sex appeal instead, that is, the number of people who will find the person in question attractive. However, the only prob...
I think there's a fundamental flaw in this post.
You're assuming that if we had unlimited willpower, we would actually use all of it. Willpower is the ability to do what you think is the most correct thing to do. If what you think is the correct thing to do is actually the correct thing to do, then doing it will, by the definition of correctness, be good. So if you do some "high-level reasoning" and conclude that not sleeping for a week is the best thing for you to do, and then you use your willpower to do it, it will be the best thing to d...
We realized that one of the very important things rationalists need is a put-down artist community, as opposed to the pick-up artist community, which already exists but isn't of much use. This is because of the very large number of rationalists who get into relationships but then aren't able to figure out how to get out of them.
So we have three people now. I hope this happens.
Oops, I guess I'm late - I just saw this post. In any case, I will come too.
It would be nice to come up with a more precise definition of 'lowering someone's status'. For example, if some person treats me like a non-person, all he is doing is expressing his opinion that I am a non-person. This, being the opinion of just one person, should not affect my status in society as a whole, and yet I feel offended. So the first question is whether this should be called lowering my status.
Also, let us assume that one person treating me like a non-person does lower my status in some way. Even then, shouting back at him and informing him that ...
Something related happens to me every once in a while when someone makes a statement of the form A -> B and I say 'yes' or 'ok' in response. By saying 'ok', all I am doing is acknowledging the truth of the statement A -> B; however, in most cases, the person assumes that I am agreeing that A is true and hence ends up concluding that B is true as well.
One example is this -
I go to a stationery shop and ask for an envelope. The storekeeper hands me one and I start inspecting it. The storekeeper observes this and remarks, "If you want a bigger envelope, I can give you one." I say, "Alright." He hands me a bigger envelope.
When someone asks you if you could pass the salt, do you pass the salt? Or just say "Yes"?
Haha! Very curious to know how this turns out!