In response to The Allais Paradox
Comment author: Dr._Science 19 January 2008 06:52:41AM 1 point [-]

It's rational to take the certain outcome if gambling causes psychological stress. Beyond being intrinsically unpleasant, stress increases your risk of peptic ulcers and stroke, which could easily cancel out the expected gain.

Comment author: ricketson 15 January 2012 07:37:24PM 1 point [-]

But such psychological stress arises from your perception of reality. If it is caused by an erroneous perception of reality, then the rational thing to do is correct your perception, not take the error for granted. If you are certain that you made the right decision, then you shouldn't feel stressed when you "lose".

In response to The Allais Paradox
Comment author: ricketson 15 January 2012 07:28:45PM *  0 points [-]

I initially chose 1A and 2B, but after reading the analysis of those decisions, I agree that they are inconsistent in a way that implies that one choice was irrational (in the context of this silly little game). So I did some introspection to figure out where I went wrong. Here's what I found:

1) I may have misjudged how small 1/34 is, and this only became apparent when the question was phrased as it is in example 2.

2) I think I assumed implicit costs in these gambles. The first cost is a delay in learning the outcome of these gambles; the second is the implicit need to work to earn this money. I think that these assumptions are reasonable because there is essentially no realistic condition in which I would instantly see the results of a decision that might earn me $27,000; there would probably be a delay of several months (if working) or years (if investing) between making the decision and learning whether I got the money or not. This prolonged uncertainty has a negative utility, since I am unable to make firm plans for the money during that interval. This negative utility would apply to all options except 1A. Furthermore, earning $24,000 would realistically require several months of work on my part. However, a project that had a 1/3 chance of paying out $24,000 might only take a month. The implicit difference in opportunity cost between scenario 1 and scenario 2 has implications for the marginal utility of money in each scenario (making me more risk-averse in scenario 1, which implicitly has a higher opportunity cost).

These implicit costs are not specified in this game, so it is technically "irrational" to incorporate them into my decision-making. However, in any realistic scenario, such costs will exist (regardless of what the salesman says), so it is good that I/we intuitively include them in my/our decision-making.
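Setting those implicit costs aside, the inconsistency the analysis points out can be made concrete by computing expected values. This is a minimal sketch, assuming the payoffs and probabilities from the original post ($24,000 certain vs. a 33/34 chance of $27,000, and the same pair scaled down by a factor of 0.34):

```python
# Expected values of the two Allais gamble pairs, assuming the
# payoff structure from the original post this comment replies to.
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

ev_1a = expected_value([(1.0, 24000)])              # certain $24,000
ev_1b = expected_value([(33/34, 27000), (1/34, 0)])
ev_2a = expected_value([(0.34, 24000), (0.66, 0)])
ev_2b = expected_value([(0.33, 27000), (0.67, 0)])

# Scenario 2 is scenario 1 with every probability multiplied by 0.34,
# so a consistent expected-utility maximizer must rank each pair the
# same way; choosing 1A and 2B reverses the ranking between pairs.
print(round(ev_1a, 2), round(ev_1b, 2))  # 24000.0 26205.88
print(round(ev_2a, 2), round(ev_2b, 2))  # 8160.0 8910.0
```

The point is not that the expected values settle the question (the whole comment is about utilities the raw dollar amounts leave out), only that the 0.34 scaling is what makes choosing 1A and 2B jointly inconsistent.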

Comment author: ricketson 06 August 2011 05:32:27PM 9 points [-]

These are good insights on how communities function... but I'm a bit lost.

What is the purpose of a "rationalist community?"

I'm a bit new here, and often the essays seem to rest upon some prior understanding that I do not have. For this particular article, there seems to be some previous discussion about rationalist communities and why they are desirable... but I don't see it in the linked articles or on the main page.

So can you tell me why I would want to participate in a "rationalist parenting club" rather than a regular one? Why not engage with mainstream institutions and try to make them as pro-reason as possible?

In response to Simpson's Paradox
Comment author: ricketson 18 January 2011 03:16:29AM 2 points [-]

Randomization of test subjects...

I've had this in the back of my mind for the past week, and finally put my finger on how this problem is solved in most experimental sciences. Sorry if I've overlooked part of the discussion, but the typical solution to this problem is to randomly assign subjects to the two groups in the experiment. That way, it is very unlikely that some underlying variable will distort the outcome in the way that the sex of the subjects did in the above example, where the women were concentrated in the A group and men in the B group.

Of course, you can't always randomize assignment to the control and treatment group, but you could in the example given (testing a medical intervention).

Comment author: ricketson 13 August 2010 01:44:44AM 7 points [-]

Hi. I just joined the site yesterday to post a comment. I've been tracking the feed for about a week, having recently decided to re-engage with the Internet. I learned of the site about three months ago, by way of a blogger writing about social issues. I disagreed with him very strongly on those issues, but I checked out his other posts and he mentioned a discussion over here (I think he's a participant).

I think that the post that originally attracted my attention was something relating to the singularity idea. Being a geek myself, I'm kinda interested in the "geek rapture", but haven't gotten a good sense of how people approach it (I know there's a book).

Anyway, I checked out the site: I liked the mission statement and the structure. Probably most importantly, the name stuck in my head. "Less Wrong" has a nice, calmly optimistic ring to it (kinda like Marginal Revolution, another blog I like). I really like how the site relies on user ratings. I've been a big fan of systems that have the community act as the gatekeeper, and have always jumped on board such projects (Wikipedia and Daily Kos, for example). I even once tried to set up a Wiki for debates, but it was very clunky and never got critical mass.

I've been participating in on-line political debates for about 15 years now. I think I've learned a lot, but I often get sick of the debates -- especially when they involve mainstream activists who just repeat the same tripe over and over again. I've also become rather cynical towards our political institutions. I don't really think that it matters what I think about politics -- if I'm not willing to make a career out of it, I'm not going to impact anything. I've decided to make my career as a scientist instead.

All of these futile political debates led me to ask why people are so bad at thinking (or at least, expressing rational thoughts). I've always viewed politics as a means to an end -- that end being human happiness -- and I'm increasingly thinking that it is irrelevant to promoting that end. I'm thinking that the real issue is in how people think and solve problems. If people think right, the politics will sort itself out. So, I'm hoping that Less Wrong can provide a more productive discussion.

Comment author: ricketson 12 August 2010 01:44:17AM 14 points [-]

Hi. I'm new here. Great blog. Great post.

One maxim that I rely on for acting rationally is "know what your time is worth". In my first real job, I was on a one-week project with a scientist who told me that my time is valuable (I think he was implying that my boss was wasting my time). This really opened up my eyes. My first application of this idea was professionally -- I can get more out of my job than just a paycheck. I can learn skills and make contacts and list accomplishments that will advance my career. I can also enjoy what I do (I'm a researcher, so that's assumed in my profession). It's sad to see colleagues who think that their time is worth no more than some measly paycheck.

The second application of this rule was in my "home economy". I used to be very cheap. Now that I've placed a $ value on my time, it puts a lot of activities in perspective, and I spend money much more readily when it buys time for more worthwhile pursuits (it helps that my cheap habits assure that I always have a nice cushion of cash around. This way, I am able to spend money when needed, without reworking my budget -- which would be a real waste of my precious time). It's sad to see people earning $70,000 a year fretting over a dollar. It's also sad to see someone who has something big to contribute to society (such as a teacher or researcher, for example) worrying about how to recycle 1/10 ounce of plastic.

This rule ties in with the "comparative advantage" rule mentioned above.

The other maxim that I like is "question reality". It is basically a directive to question your own beliefs, ask "is this real?" It applies to everything, and it subsumes the traditional "Question authority" maxim, because unjust authority typically depends upon people being indoctrinated with a particular view of reality.

Thanks for reading. I look forward to participating in this site!
