Comment author: Vladimir_Nesov2 29 December 2007 10:44:34PM 0 points [-]

Caledonian, I think you are confusing goals with truths. If the truth is that the goal consists of certain things, rationality doesn't oppose that in any way. Rationality is merely a tool that optimizes performance, not an arbitrary moral constraint.

Comment author: Vladimir_Nesov2 29 December 2007 08:24:12PM 0 points [-]

OC: Eliezer, enough with your nonsense about cryonicism, life-extensionism, trans-humanism, and the singularity. These things have nothing to do with overcoming bias. They are just your arbitrary beliefs.

I guess it's the other way around: the point of most of the questions Eliezer raises is to take a debiased look at controversial issues such as the ones you list, hopefully building a solid case for sensible versions of them. For example, existing articles already point at the fallacies in your assertion: you assume cryonics, etc. to be separate magisteria outside the domain of rationality, and you argue from the apparent absurdity of these issues.

Comment author: Vladimir_Nesov2 29 December 2007 11:58:47AM 1 point [-]

Eliezer,

Your emphasis on leadership in this context seems strange: it was in no one's interest to leave, so the biased decision was to follow you, not any hesitation in choosing to lead others outside.

Comment author: Vladimir_Nesov2 26 December 2007 09:48:07AM 1 point [-]

It feels like there was no explicit rule against asking questions. It would be interesting to know what percentage of subjects actually questioned the process.

If people are conforming rationally, then the opinion of 15 other subjects should be substantially stronger evidence than the opinion of 3 other subjects.

I don't see how a moderate number of other wrong-answering subjects should influence the decision of a rational subject, even if it is, strictly speaking, stronger evidence: the probability that your own perception is broken should be much lower than the probability of alternative explanations for the other subjects' wrong answers.
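As a toy Bayesian sketch of this point (the model and all numbers are hypothetical, not from the experiment): suppose that with some probability q the group's wrong answers have a common non-evidential cause (conformity, a rigged setup), and otherwise each subject independently errs at a small rate. Then going from 3 to 15 dissenters barely moves the posterior in your own perception, because the alternative explanation soaks up the extra "evidence":

```python
# Toy model, all parameters hypothetical:
#   p - prior probability that my own perception is sound
#   q - probability that the group's wrong answers share a common
#       non-evidential cause (conformity, a rigged setup)
#   e - independent per-subject error rate otherwise
def p_right(n, p=0.9999, q=0.01, e=0.05):
    # Likelihood of n dissenting answers if my perception is sound:
    # either the common cause is active, or all n erred independently.
    like_sound = q + (1 - q) * e ** n
    # Likelihood if my perception is broken: each of the n subjects
    # simply reports what they correctly see.
    like_broken = (1 - e) ** n
    return p * like_sound / (p * like_sound + (1 - p) * like_broken)

print(p_right(3))   # ~0.992
print(p_right(15))  # ~0.995
```

Under these (made-up) numbers, the posterior that you are right stays above 99% and shifts by well under a percentage point between 3 and 15 dissenters.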

Comment author: Vladimir_Nesov2 18 November 2007 11:40:10AM 1 point [-]

Since there are insanely many slightly different outcomes, a terminal value is also too big to be considered explicitly. So it's useless to pose the question of distinguishing terminal values from instrumental values, since you can't reason about specific terminal values anyway. All the values you can actually reason about are instrumental values.

In response to Rationalization
Comment author: Vladimir_Nesov2 01 October 2007 05:37:04PM 0 points [-]

Eliezer: An intuitive guess is non-scientific but not non-rational

That doesn't affect my point, but do you argue that intuitive reasoning can be made free of bias?

In response to Applause Lights
Comment author: Vladimir_Nesov2 11 September 2007 09:28:10PM 2 points [-]

Such speech could, in principle, perform a "bringing to attention" function. Chunks of "bringing to attention" can encode any kind of knowledge; it's just an inefficient form. The abnormality of such speech lies in its utter inefficiency, not in a lack of content. People can bear such talk because similar inefficiency is present, in different forms, in other kinds of talk. Inefficiency also makes it much easier to obfuscate the avoidance of certain topics.

In response to Fake Causality
Comment author: Vladimir_Nesov2 24 August 2007 01:04:25AM 1 point [-]

Phlogiston is not necessarily a bad thing. Concepts are used in reasoning to reduce and structure the search space. A concept can be placed in correspondence with a multitude of contexts, selecting a branch with the required properties that correlate with its usage. In this case, an active 'phlogiston' concept correlates with the presence of fire. Unifying all fire-exhibiting processes under this tag can help in developing induction contexts, and the process of refining it includes examining protocols in which the 'phlogiston' concept appears. It's just not a causal model that can rigorously predict nontrivial results through deduction.

Comment author: Vladimir_Nesov2 15 August 2007 09:36:07PM 2 points [-]

Just a question of bookkeeping: online confidence updating can be no less misleading, even if every fact is processed exactly once. A million negative arguments can have a negligible total effect if they happen to be dependent in a non-obvious way.
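A toy log-odds sketch of the bookkeeping failure (the numbers are hypothetical, not from the comment): if each of a million negative arguments would, taken alone, multiply the odds by 0.9, naively updating online a million times drives the posterior to zero; but if all million arguments turn out to restate one underlying fact, the correct joint update counts that fact once:

```python
import math

def posterior(log_odds):
    # Numerically safe sigmoid: avoids overflow in math.exp
    # for large-magnitude log-odds.
    if log_odds >= 0:
        return 1.0 / (1.0 + math.exp(-log_odds))
    e = math.exp(log_odds)
    return e / (1.0 + e)

prior_log_odds = 0.0   # prior probability 0.5
lr = 0.9               # each argument, taken alone, multiplies the odds by 0.9

# Naive online bookkeeping: treat a million negative arguments as independent.
naive = posterior(prior_log_odds + 1_000_000 * math.log(lr))

# If all million arguments are restatements of one underlying fact,
# the correct joint update counts that fact once.
joint = posterior(prior_log_odds + math.log(lr))

print(naive)  # effectively 0
print(joint)  # ~0.474
```

The gap between the two results is exactly the hidden dependence: the naive tally multiplies in the same likelihood ratio a million times, while the joint account applies it once.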

Comment author: Vladimir_Nesov2 29 July 2007 10:01:17AM 2 points [-]

Some ungrounded concepts can produce your own behavior, which in itself can be experienced, so it's difficult to draw the line just by requiring concepts to be grounded. You believe that you believe in something because you experience yourself acting in a way consistent with believing in it. This can define an intrinsic goal system, a point in mind design space, as you call it. So one can't abolish all such concepts, only resist acquiring them.
