Comment author: NancyLebovitz 29 December 2011 06:26:48PM 2 points [-]

Something to think about if you have a goal of losing weight. How do you decide whether a goal makes sense?

Comment author: _ozymandias 29 December 2011 06:51:19PM 0 points [-]

Interesting article!

I presume that "I realized this goal was irrational and switched to a different goal that would better achieve my values" would also be a victory for instrumental rationality...

Comment author: MileyCyrus 29 December 2011 04:34:34AM *  1 point [-]

However, in the great football game of feminists and men's rights advocates, I'm pretty much on Team Feminism, which is why I get so upset when it's clearly doing things wrong.

What I meant is that you actually demand results from your team, instead of giving them a free pass just because they have a certain label.

Comment author: _ozymandias 29 December 2011 04:59:01AM 0 points [-]

Ah, thank you. I misunderstood. :) I've had a few problems with people being confused about why my blog uses so much feminist dogma if it's a men's rights blog, so I'm hyper-sensitive about being mistaken for a non-feminist.

Comment author: MileyCyrus 29 December 2011 03:50:29AM 4 points [-]

Her blog is good. Instead of blindly cheering for a side in the feminism vs men's-rights football game, Ozymandias actually tries to understand the problem and recommend workable solutions.

Comment author: _ozymandias 29 December 2011 04:20:52AM 4 points [-]

Thank you very much, Miley! I tend to view feminism and men's rights as being inherently complementary: in general, if we make women more free of oppressive gender roles, we will tend to make men more free of oppressive gender roles, and vice versa. However, in the great football game of feminists and men's rights advocates, I'm pretty much on Team Feminism, which is why I get so upset when it's clearly doing things wrong.

Also, my pronoun is zie, please. :)

Comment author: ksvanhorn 29 December 2011 03:43:36AM 1 point [-]

Instrumental rationality is the focus we have in mind -- doing the things that most enhance your personal utility. Avoiding cognitive biases and having beliefs that match reality better are means to better instrumental rationality, but not the end. Some of the things that I think would fall under instrumental rationality would be better decisions (the ones important enough to merit some analyzing), determining what habits would be good to acquire or discard, and overcoming akrasia. I think we would have to start highly focused on one of these areas and a specific target market, and branch out over time.

As to how to test the benefit of the training... I've put that on my list of questions to consider. I don't know the answer right now. But anything that has an observable effect of some sort will be measurable in some fashion.

Comment author: _ozymandias 29 December 2011 04:11:28AM *  1 point [-]

To a certain degree one could test instrumental rationality indirectly. Perhaps have them set a goal they haven't made much progress on (dieting? writing a novel? reducing existential risk?) and see if instrumental rationality training leads to more progress on the goal. Or give people happiness tests before and a year after completing the training (i.e. when enough time has passed that the hedonic treadmill has had time to work). Admittedly, these indirect methods are incredibly prone to confounding variables, but if averaged over a large enough sample size the trend should be clear.

Comment author: _ozymandias 29 December 2011 03:29:25AM 1 point [-]

I think the most important thing about a rationality training service is operationalizing what is meant by rationality.

What exact services would the rationality training service provide? Would students have beliefs that match reality better? Be less prone to cognitive biases? Tend to make decisions that promote greater utility (for themselves or others)? How would you test this? Martial arts dojos tend to (putting it crudely) make their students better at hitting things than they were before; that's a lot easier to objectively measure than making students better at thinking than they were before.

I personally would not pay for a rationality training service unless it provided clear, non-anecdotal evidence that the average person received some benefit. I'd be particularly concerned about whether the service actually taught people to think more clearly, or simply inculcated them with the views of the people running the service.

Comment author: mwengler 28 December 2011 06:51:33PM 5 points [-]

Do you think lesswrong.com has a dogma which is somewhat complex and engenders a group identity? Cryonics anybody? Religious tolerance around here consists of a truce between the one-boxers and the two-boxers. I was told just the other day that if I didn't think sentience arises from the operation of a Turing machine that I was probably in the wrong place.

How about the academic world of PhD physicists? Their dogma is not expensive because it is "wrong" but rather it is expensive because it requires great and specialized skills and knowledge to manipulate it. It results in people who characteristically have different opinions than their complementary group, and who tend to trust each other in certain arenas more than they would trust non-PhD physicists.

I suspect that this affiliation through agreed-upon dogma is more a feature than a bug, more a mechanism for creating larger more complex creations through uniting the efforts of many people. For the groups we agree with we prefer not to notice that they have some real features in common with the groups we don't agree with, but I submit that our tribe dogma should include a recognition of that fact (and actually, I think it mostly does.)

Comment author: _ozymandias 28 December 2011 11:25:48PM 3 points [-]

I think the distinction is not between logical and illogical ideas, but between high-cost and low-cost ideas.

Illogical ideas are generally high-cost, for the reasons outlined in the OP, unless you live in a society in which everyone accepts the high-cost idea (for instance, Creationism in the American South). Cryonics is a high-cost idea: it may be right, but it is also deeply weird and unlikely to find acceptance among non-transhumanists. PhD physicists have high-cost ideas because of the time and effort required to understand them. Even jargon might count as a high-cost idea because of the price you pay in ease of communication, especially jargon that those outside the group tend to understand differently than those inside the group (for instance, feminists tend to use patriarchy to mean "the system of institutionalized societal sexism", while most non-feminists interpret it as meaning "all men oppressing all women").

Of course, all this is purely speculative. And the causation might go the other way: instead of adopting a high-cost idea signalling one's membership in the group, it might be that high-cost ideas tend to create groups, because low-cost ideas tend to be adopted by large numbers of people.

Comment author: Jayson_Virissimo 28 December 2011 04:19:38PM *  2 points [-]

I am considering this and am wondering if it would be worth it to do a separate Less Wrong read through or if I should just go along with the Redditors. Any thoughts?

Comment author: _ozymandias 28 December 2011 05:00:13PM 2 points [-]

I'd like a separate Less Wrong readthrough because I don't have a Reddit account and don't want to acquire one for the sole purpose of the readthrough (because then I'll comment on Reddit, and I have quite enough time-wasting things to do on the Internet already :) ).

Comment author: windmil 28 December 2011 03:59:55AM 0 points [-]

The only LWer that I've noticed was from Florida! (Of course, people don't too frequently pepper their posts with particulars of their placement.)

Comment author: _ozymandias 28 December 2011 05:21:36AM 1 point [-]

Where are you? I'm in Fort Lauderdale and the Tampa area. If we're near each other maybe we could arrange one of those meetup thingies...

Comment author: _ozymandias 27 December 2011 10:57:18PM 6 points [-]

I'm another classic brilliant-at-age-ten kid. The biggest problem I experienced related to being considered smart rather young was that a lot of my sense of self-worth got tied up in being the smartest kid in the room. This is suboptimal-- not only does it lead to the not asking stupid questions issue, but it also means that as soon as I was in a situation in which I wasn't smart about something, I felt like I had no worth as a human being whatsoever. (Possible confounding variable: I had depression.)

The closest thing to a solution I've found is to try to derive my self-worth from multiple sources. I am worth something as a human being not simply because of intellectual achievements, but also because I have friends who like me, I give to charity, I refused to give up. I don't know how well this will work for other people, though.

The other big problem I encountered is that I tended to automatically give up if I wasn't immediately good at something; this is why, among other reasons, I have a roughly ninth-grade understanding of math, even though I've taken calculus. (I've read studies that suggest that that's common among children praised for traits instead of actions; I'm away from JSTOR and my psych textbooks at the moment, but if someone would like a citation then I can dig it up in a week or so.) My solution was to grade myself on process instead of achievement: I defined success not as "learning two new songs" but as "practicing guitar for half an hour every week." My other solution was to work to overcome the ugh fields around activities I'm generally not good at, and to redefine those fields within my brain as "cool stuff I haven't learned yet", not "stuff I can't do."

Comment author: _ozymandias 27 December 2011 05:32:24PM 22 points [-]

Hi everyone! I'm Ozy.

I'm twenty years old, queer, poly, crazy, white, Floridian, an atheist, a utilitarian, and a giant geek. I'm double-majoring in sociology and psychology; my other interests range from classical languages (although I am far from fluent) to guitar (although I suck at it) to Neil Gaiman (I... can't think of a self-deprecating thing to say about my interest in Neil Gaiman). I use zie/zir pronouns, because I identify outside the gender binary; I realize they're clumsy, but English's lack of a good gender-neutral pronoun is not my fault. :)

One of my big interests is the intersection between rationality and social justice. I do think that a lot of the -isms (racism, sexism, ableism, etc.) are rooted in cognitive biases, and that we're not going to be able to eliminate them unless we understand what quirks in the human mind cause them. I blog about masculism (it is like feminism! Except for dudes!) at No Seriously What About Teh Menz; right now it's kind of full of people talking about Nice-Guy-ism, but normally we have a much more diverse front page. I believe that several of the people here read us (hi Nancy! hi Doug! hi Hugh, I like you, when you say I'm wrong you use citations!).

I've lurked here for more than a year; I got here from Harry Potter and the Methods of Rationality, just like everyone else. I've made my way through a lot of the Sequences, but need to set aside some time to read through all of them. I don't know much about philosophy, math, science, or computers, so I imagine I will be lurking here a lot. :)
