All of Joshua's Comments + Replies

Joshua00

I'm thinking of cases where you're unable to reach a better solution to a problem because what you already know conflicts with arriving at that solution.

Say your data leads you to an inaccurate initial conclusion. Everybody agrees on this conclusion. Wouldn't that conclusion be data for more inaccurate conclusions?

So I thought there would need to be some bias introduced into your reasoning so that occasionally you didn't go along with the inaccurate claim. That way, if some of the data is wrong, you still have rationalists who arrive at a more accurate map.

Tried to unpack it. Noti...

2Ratheka
I think even a perfect implementation of Bayes would not, in and of itself, be an AI. By itself, the math has nothing to work on and no direction in which to work. Agency is hard to build, I think. As always, of course, I could be wrong.
0somejan
There's nothing in being a rationalist that prevents you from considering multiple hypotheses. One thing I've not seen elaborated on a lot on this site (but maybe I've just missed it) is that you don't need to commit to one theory or another; the only time you're forced to commit yourself is when you need to make a choice in your actions. And then you only need to commit for that choice, not for the rest of your life.

So a bunch of perfect rationalists who have observed exactly the same events/facts (which of course doesn't happen in real life) would ascribe exactly the same probabilities to a bunch of theories. If new evidence came in, they would all switch to the new hypothesis, because they were all already contemplating it but considering it less likely than the old hypothesis.

The only thing preventing you from considering all possible hypotheses is lack of brain power. This limited resource should probably be divided among the possible theories in the same ratio as your certainty about them: if you think theory A has a 50% probability of being right, theory B 49%, and theory C 1%, you should spend 99% of your effort on theories A and B. But if the probabilities are 35%, 33%, and 32%, you should spend almost a third of your resources on theory C. (Assuming the goal is just to find truth; if the theories have other utilities, those should be weighed in as well.)
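The allocation rule in the comment above is easy to make concrete. Below is a minimal Python sketch; the function name `allocate_effort` and the `effort_budget` parameter are illustrative choices, not anything from the original comment.

```python
def allocate_effort(probabilities, effort_budget=1.0):
    """Split effort_budget across theories in proportion to their probabilities."""
    total = sum(probabilities.values())
    return {theory: effort_budget * p / total for theory, p in probabilities.items()}

# Lopsided case from the comment: theory C gets only 1% of the effort.
print(allocate_effort({"A": 0.50, "B": 0.49, "C": 0.01}))

# Close case from the comment: theory C gets almost a third of the effort.
print(allocate_effort({"A": 0.35, "B": 0.33, "C": 0.32}))
```

Normalizing by the sum means the rule still behaves sensibly even if the stated probabilities don't add up to exactly 1.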
Joshua10

Isn't this exactly what was said in Hug The Query? I'm not sure I understand why you were downvoted.

1Kenny
"Belief is not suitable as any kind of evidence when more-direct evidence is available ..." is more like 'You Can Only Ever Hug The Query By Yourself'.
0Blueberry
Caledonian was a well-known LW troll who would frequently make vague, unreadable, critical, somewhat hostile remarks.
Joshua00

While reading through this I ran into a problem. It seems intuitive to me that to be perfectly rational, there would have to be instances in which, given the same information, two rationalists disagreed. I think this because I presume that a lack of randomness leads to a local maximum. Am I missing something?
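One way to picture the intuition in the comment above is deterministic hill climbing: reasoners who all run the same deterministic search from the same starting information end up on the same peak, which may only be a local maximum, while injecting randomness lets some of them find a higher one. The two-peak objective, step size, and restart range in the sketch below are illustrative assumptions, not anything from the thread, and whether the picture applies to ideal Bayesian reasoners is a separate question.

```python
import math
import random

def f(x):
    # Two peaks: a lower one near x = -1 and a higher one near x = 2.
    return math.exp(-(x + 1) ** 2) + 2 * math.exp(-(x - 2) ** 2)

def hill_climb(start, step=0.05, iters=1000):
    """Greedy deterministic ascent: move to the best neighbour until stuck."""
    x = start
    for _ in range(iters):
        best = max([x - step, x, x + step], key=f)
        if best == x:
            break
        x = best
    return x

# Identical reasoners starting from the same point all stop at the lower peak.
x_det = hill_climb(-1.5)
print(round(x_det, 2), round(f(x_det), 2))  # stuck near x = -1

# Randomly perturbed starting points: some runs reach the higher peak near x = 2.
random.seed(0)
starts = [-1.5 + random.uniform(-3, 3) for _ in range(10)]
print([round(hill_climb(s), 2) for s in starts])
```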

1Vladimir_Nesov
Unpack "local maxima". Maxima of what?
Joshua10

When I read that, the "properly" part really stood out to me. I felt like I was reading about a "true" Scotsman, the sort who would never commit a crime.