What are rationalist presumptions?
I am new to rationality and Bayesian ways of thinking. I am reading the Sequences, but I have a few questions along the way. These questions are from the first article (http://lesswrong.com/lw/31/what_do_we_mean_by_rationality/).
Epistemic rationality
I suppose we do presume things: that we are not dreaming, not under a global and permanent illusion by a demon, not a brain in a vat, not in a Truman Show, not in a Matrix. And, sufficiently often, that you mean what I think you mean. I am wondering whether there is a list of things that rationalists presume and take for granted without further proof. Is there anything that is self-evident?
Instrumental rationality
Sometimes one value can be derived from another (e.g. I do not value monarchy because I hold the value that all men are created equal). But then either our values are circular, or we take some value to be evident ("We hold these truths to be self-evident, that all men are created equal"). I think circular values make no sense. So my question is: what are the values that most rationalists agree are intrinsically valuable, or self-evident, or can be presumed valuable in and of themselves?