Comment author: satt 04 May 2014 11:36:32PM 1 point [-]

My attempt to boil the post down to a one sentence conclusion: being super into epistemic rationality is a very good thing, but it is not the only good thing.

Comment author: alicey 05 May 2014 12:38:44AM *  2 points [-]

oh, sure

Comment author: alicey 04 May 2014 11:05:16PM *  3 points [-]

i'm into epistemic rationality, but this all seems pretty much accurate and stuff

not sure what to conclude from having that reaction to this post.

Comment author: TheOtherDave 01 March 2014 09:30:38PM 2 points [-]

I understand this to mean that the only value you see to non-brevity is its higher success at manipulation.

Is that in fact what you meant?

Comment author: alicey 14 March 2014 11:49:31PM *  0 points [-]

-

Comment author: ThrustVectoring 02 March 2014 10:57:28AM 1 point [-]

I suspect that the issue is not terseness, but rather not understanding and bridging the inferential distance between you and your audience. It's hard for me to say more without a specific example.

Comment author: alicey 14 March 2014 11:42:11PM 0 points [-]

revisiting this, i consider that perhaps i am likely to discard parts of the frame message and possibly outer message - because, to me of course it's a message, and to me of course the meaning of (say) "belief" is roughly what http://wiki.lesswrong.com/wiki/Belief says it is

Comment author: jamesf 02 March 2014 03:27:05AM *  0 points [-]

What does brevity offer you that makes it worthwhile, even when it impedes communication?

Predicting how communication will fail is generally Really Hard, but it's a good opportunity to refine your models of specific people and groups of people.

Comment author: alicey 14 March 2014 11:29:40PM 0 points [-]

improving signal to noise, holding the signal constant, is brevity

when brevity impedes communication, but only with a subset of people, the signal is reduced because they're not good at understanding brief things. so it is worth not being brief with them, but it's not fun

Comment author: TheOtherDave 01 March 2014 04:40:02PM 2 points [-]

Well, you describe the problem as terseness.
If that's true, it suggests that one set of improvements might involve explaining your ideas more fully and providing more of your reasons for considering those ideas true and relevant and important.

Have you tried that?
If so, what has the result been?

Comment author: alicey 01 March 2014 05:58:28PM *  0 points [-]

-

Comment author: 7EE1D988 01 March 2014 10:58:35AM 12 points [-]

I can see benefits to the principle of charity. It helps avoid flame wars, and from a Machiavellian point of view it's nice to close off the "what I actually meant was..." responses.

Some people are just bad at explaining their ideas correctly (too hasty, didn't reread themselves, not a high enough verbal SAT, foreign mother tongue, inferential distance, etc.), others are just bad at reading and understanding others' ideas correctly (too hasty, didn't read the whole argument before replying, glossed over that one word which changed the whole meaning of a sentence, etc.).

I've seen many poorly explained arguments which I could understand as true or at least pointing in interesting directions, which were summarily ignored or shot down by uncharitable readers.

Comment author: alicey 01 March 2014 04:28:32PM *  4 points [-]

i tend to express ideas tersely, which counts as poorly-explained if my audience is expecting more verbiage, so they round me off to the nearest cliche and mostly downvote me

i have mostly stopped posting or commenting on lesswrong and stackexchange because of this

like, when i want to say something, i think "i can predict that people will misunderstand and downvote me, but i don't know what improvements i could make to this post to prevent this. sigh."

revisiting this on 2014-03-14, i consider that perhaps i am likely to discard parts of the frame message and possibly outer message - because, to me of course it's a message, and to me of course the meaning of (say) "belief" is roughly what http://wiki.lesswrong.com/wiki/Belief says it is

for example, i suspect that the use of more intuitively sensible grammar in this comment (mostly just a lack of capitalization) often discards the frame-message-bit of "i might be intelligent" (or ... something) that such people understand from messages (despite this being an incorrect thing to understand)

Comment author: Kaj_Sotala 18 February 2014 07:06:38AM *  3 points [-]

naturalized!Cai

I'm not sure that using this notation is a good idea, given that at least some of the readers unfamiliar with it are likely to initially parse it as "naturalized not-Cai". Even I did for a brief moment, because I was parsing the writing using my logic!brain rather than my fanfiction!brain.

Comment author: alicey 19 February 2014 05:18:38AM *  0 points [-]

this is why i like ¬

script your keyboard! make it so that the chords ~1 and 1~ output a '¬'! or any other chord, really

if this actually sounds interesting and you use windows you can grab my script at https://github.com/alice0meta/userscripts/tree/master/ahk
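(the linked repo holds alicey's actual script; as a rough illustration only, the chord-to-'¬' mapping described above could be sketched as an AutoHotkey v1 hotstring file like the following, where the `*` option makes the replacement fire immediately without waiting for an ending character:)

```ahk
; minimal sketch, not the script from the linked repo:
; typing the chord ~1 or 1~ is replaced with the negation sign
:*:~1::¬
:*:1~::¬
```

(save with UTF-8 encoding so the '¬' character survives, then run the .ahk file with AutoHotkey installed)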

Comment author: jpaulson 16 February 2014 07:09:20PM 1 point [-]

Most of your post is not arguments against curing death.

People being risk-averse has nothing to do with anti-aging research and everything to do with individuals not wanting to die...which has always been true (and becomes more true as life expectancy rises and the "average life" becomes more valuable). The same is true for "we should risk more lives for science".

I agree that people adapt OK to death, but I think you're poking a strawman; the reason death is bad is that it kills you, not that it makes your friends sad.

I think "death increases diversity" is a good argument. On the other hand, most people who present that argument are thrilled that life expectancy has increased to ~70 from ~30 in ancient history. Why stop at 70?

Comment author: alicey 16 February 2014 10:48:03PM *  2 points [-]

note: "life expectancy used to be ~30" is a common misconception (it's being skewed by infant mortality) (life expectancy has gone up a lot, just not that much)

(as far as i know. i've been told that it's a common misconception that this is a common misconception, but they refused to cite sources)

Comment author: alicey 16 February 2014 03:08:36PM *  -1 points [-]

short response is "yeah, sure, sorta ... but only if you're a stupid group. we can do better."

edit: http://lesswrong.com/lw/jop/a_defense_of_senexism_deathism/akk3 is the longer version of this response
