Will_Sawin comments on The Neglected Virtue of Scholarship - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I'm not deeply invested in decision theory; it is one of several fun things I like to think about. To demonstrate an extreme version of this attitude: I am thinking about a math problem right now. I know that there is a solution in the literature - someone told me. I do not plan to find that solution in the literature.
Now, in decision theory I am more interested in getting the correct answer, versus finding the answer myself, than in that math problem. But the primary reason I think about decision theory is still not that I want to know the answer. So if someone said, "here's a paper that I think contains important insights on this problem," I'd read it; but if they said, "here's a bunch of papers written by a community whose biases you find personally annoying and do not think are conducive to solving this particular problem, some of which probably contain some insights," I'd be more wary.
It should be noted that I do agree with your point to some extent, which is why we are having this discussion.
Indeed.
That did not appear to be the case when I looked.
which you linked to because, AFAICT, it is one of only three SEP pages that mention Newcomb's Problem; I have read the relevant parts of two of them and will read the third soon.
To see that he has insights, you just need to read his blog posts, although to be fair many of the ideas get less than a LessWrong-length post of explanation.
I'd expect the best ones to.
It seems that once I exhaust your limited but easily accessible knowledge, which is about now, I should look up philosophical decision theory papers at the same leisurely pace at which I think about decision theory. My university should have some sort of database.
It seems like it does just the wrong thing to me. For example, it two-boxes on Newcomb's problem.
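To make the two-boxing complaint concrete, here is a toy calculation (my own sketch, using the standard hypothetical payoffs of $1M in the opaque box and $1k in the transparent one, with predictor accuracy p). From the evidential standpoint one-boxing dominates for any reliable predictor, while the causal dominance argument recommends two-boxing regardless:

```python
# Toy Newcomb's problem arithmetic. Payoffs and accuracy p are the usual
# illustrative numbers, not anything from the thread.

def ev_one_box(p):
    # Evidential view: one-boxing is strong evidence the opaque box is full.
    return p * 1_000_000

def ev_two_box(p):
    # Evidential view: two-boxing is strong evidence the opaque box is empty,
    # but you keep the transparent $1k either way.
    return (1 - p) * 1_000_000 + 1_000

p = 0.99
print(round(ev_one_box(p)))   # 990000
print(round(ev_two_box(p)))   # 11000

# Causal dominance: whatever amount b is already in the opaque box,
# two-boxing yields b + 1000 > b, which is why a CDT-style theory two-boxes.
```

The tension is exactly that the dominance argument and the expected-value comparison point in opposite directions.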
However, the amount of sense it seems to make leads me to suspect that I don't understand it. When I have time, I will read the appropriate paper(s?) until I'm certain I understand what he means.
TDT and UDT as currently formulated would make the correct counterfactual prediction:
"If I go to Damascus, I'll die; if I go to Aleppo, I'll die; if I use a source of bits that Death doesn't have access to, I'll live with probability 1/2."
which avoids decision instability; in general, these theories don't let you condition your decision on that very decision.
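A quick Monte Carlo sketch of the Death-in-Damascus claim (the model is my own assumption: Death perfectly predicts any deterministic policy, but an independent random bit is opaque to him, so he can only guess):

```python
import random

# Toy Death-in-Damascus model. Assumptions mine: a deterministic agent is
# predicted exactly and always meets Death; a randomizing agent is caught
# only when Death's independent guess happens to match the coin flip.

def survival_rate(randomize, trials=100_000, seed=0):
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        if randomize:
            choice = rng.choice(["Damascus", "Aleppo"])
            # Death cannot see the coin, so his guess is independent.
            deaths_guess = rng.choice(["Damascus", "Aleppo"])
        else:
            choice = "Damascus"      # any deterministic policy works the same
            deaths_guess = choice    # Death predicts it exactly
        survived += choice != deaths_guess
    return survived / trials

print(survival_rate(randomize=False))  # 0.0
print(survival_rate(randomize=True))   # ≈ 0.5
```

The deterministic agent dies every time; the coin-flipping agent survives about half the time, matching the counterfactual prediction quoted above.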
I was aware of the existence of papers, and I knew some of the main ideas that were contained in them.
There is something about academic philosophy that is not conducive to reaching conclusions about problems and then moving on to other, harder problems at anywhere near the rate of many other academic disciplines. Clearly some of this is because philosophy is hard, but some of it is also due to the collective irrationality of philosophers.
I don't know as much as I should. I know some.
Writing up a large collection of true philosophical statements that contains very few false ones, while not much of an achievement in itself, is an indicator of what I think is the right attitude, especially for problems like decision theory.
AI theory is also an enormous intuition pump for this type of problem.
Considering the outside view leads me to two conclusions:
1. You're right.
2. The best way to make progress on DT is, if possible, to get our ideas published, allowing TDT and academic philosophy's ideas to mingle and recombine into superior ideas in the minds of more than O(5) people. Alternatively, if TDT sucks, then attempting this will lead academic philosophers to produce strong arguments for why TDT sucks, which will also help figure out the problem.
I believe my current planned actions WRT reading philosophy papers are sufficient to cover the outside and inside evidence for 1, and I'm trying to figure out whether there are better strategies than Eliezer's current one for 2, and what the costs are.