
Meetup : Rationality Potluck

0 MathieuRoy 25 May 2017 06:28PM

Discussion article for the meetup : Rationality Potluck

WHEN: 25 May 2017 06:30:00PM (-0400)

WHERE: 1191 Avenue Hope, Montreal

Eric Chisholm from the Vancouver Rationalist Community is staying at the Macroscope this week. You're invited to come say hi, and talk with other rationality enthusiasts! Feel free to invite friends.

Bring food and/or a beverage if possible. Vegan food will be available.

Eric Chisholm, an alumnus of the Center for Applied Rationality, will present Double Crux, a technique for resolving disagreements.

Facebook Event : https://goo.gl/f8Uwfg


[Link] Wikipedia book based on betterhumans' article on cognitive biases

1 MathieuRoy 14 October 2016 01:03AM
Comment author: MathieuRoy 17 January 2015 05:33:04AM *  0 points
  • jumpsuits/one-pieces (I find them really comfortable)
  • if you don't have a lot of dishes (e.g., you live alone), something like this to avoid putting your hands in hot water, with soap in the handle to be more efficient
  • a second pillow to put between or below your legs when you sleep
Comment author: MathieuRoy 11 December 2014 05:43:34AM 2 points

David Pizer started a petition to promote more anti-aging research.

"In 40 to 100 years, if the world governments spent money on research for aging reversal instead of for research on building weapons that can kill large numbers of people, world scientists could develop a protocol to reverse aging and at that time people could live as long as they wanted to in youthful, strong, healthy bodies."

To sign the petition, go here

Comment author: Adele_L 06 November 2014 06:42:45AM *  7 points

It's well known that men are better at mental rotation and other forms of spatial reasoning than women. I've always been pretty good at it - my default technique is to carefully check the relations (i.e. count the number of cubes in the segment, note the relative angle of the joint, and make sure they match). It was only recently that I realized that some people actually just rotated it in their head, and 'looked' to see if it was the same.
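The two strategies described above can be made concrete in code. Here is a minimal sketch (my own illustration, not part of the original comment; the example figures, the function names, and the choice of "pairwise distances" as the relational check are all assumptions) contrasting an orientation-free relational comparison with a brute-force search over all 24 rotations, roughly mirroring "check the relations" versus "rotate it in your head":

```python
import numpy as np
from itertools import permutations, product

def rotation_matrices():
    """All 24 proper rotations that map the cubic lattice onto itself."""
    mats = []
    for perm in permutations(range(3)):
        for signs in product((1, -1), repeat=3):
            m = np.zeros((3, 3), dtype=int)
            for row, (col, sign) in enumerate(zip(perm, signs)):
                m[row, col] = sign
            if round(np.linalg.det(m)) == 1:  # keep rotations, discard reflections
                mats.append(m)
    return mats

def relational_signature(cubes):
    """'Check the relations': an orientation-free description of the figure,
    here simply the multiset of pairwise squared distances between cube centres.
    (Cruder than the joint-angle checks above: it can't tell a mirror image apart.)"""
    pts = np.asarray(cubes)
    diffs = pts[:, None, :] - pts[None, :, :]
    return sorted(int(d) for d in (diffs ** 2).sum(axis=-1).ravel())

def same_by_relations(fig_a, fig_b):
    return relational_signature(fig_a) == relational_signature(fig_b)

def same_by_rotation(fig_a, fig_b):
    """'Rotate it in your head': try every rotation and see whether the figures line up."""
    def normalise(points):
        pts = np.asarray(points)
        pts = pts - pts.min(axis=0)            # ignore translation
        return {tuple(p) for p in pts}
    target = normalise(fig_b)
    return any(normalise(np.asarray(fig_a) @ R.T) == target for R in rotation_matrices())

if __name__ == "__main__":
    # An L-shaped figure of unit cubes and a copy rotated 90 degrees about the z axis.
    figure  = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (2, 1, 0)]
    rotated = [(0, 0, 0), (0, 1, 0), (0, 2, 0), (-1, 2, 0)]
    print(same_by_relations(figure, rotated))  # True
    print(same_by_rotation(figure, rotated))   # True
```

The distance-multiset signature is deliberately crude: unlike the joint-angle checks described above, it cannot distinguish a figure from its mirror image, which is the kind of distractor mental-rotation tasks typically use.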

Anyway, I was wondering if maybe the technique used was correlated with gender.

What sex were you assigned at birth?

With what gender do you primarily identify?

What method do you use to do mental rotations?

(Something else)


Comment author: MathieuRoy 06 November 2014 11:13:48PM 0 points

Telling people in advance what results you expect changes the results for many reasons (e.g., the Pygmalion effect, the golem effect, stereotype threat, etc.).

Comment author: MathieuRoy 27 October 2014 06:52:53AM *  30 points

Done it. The whole thing! (edit: except the last question)

Comment author: NancyLebovitz 15 October 2014 07:36:12AM 1 point

I think most people's desired life span has a lot to do with how healthy they expect to be.

Comment author: MathieuRoy 16 October 2014 02:49:40AM 0 points

Good point, I edited the post to make that clear.

Comment author: Vulture 15 October 2014 07:35:36PM 0 points

death = permanently not conscious; if you create a clone or a simulation that is not a direct upload, it doesn't count as 'still living'

I understand that it's part of the framing of the question, but I still think that a lot of people would take issue with this part.

Comment author: MathieuRoy 16 October 2014 02:36:19AM 0 points

Is it because a lot of people think that continuing to live as a clone or a simulation is just as good as continuing to live as the original? If so, then I don't mind rephrasing what I mean by death. The important point is that I don't mean the death of the body, but rather the death of the mind.

Comment author: blacktrance 14 October 2014 11:21:05PM 1 point

Do nihilists think they have no goals (aka terminal values), or that they don't have goals about fulfilling others' goals, or is it something else?

I am not a nihilist, and I don't know if I'd be able to pass an Ideological Turing Test as one, but to give my best answer to this, the nihilist would say that there are no moral oughts. How they connect this to terminal goals varies depending on the nihilist.

Ok, so would it be right to say the following? Utilitarianism means giving equal weight to everyone's utility function (including yours) in your "meta" utility function. Egoism means you don't consider others' utility functions in your utility function.

The first part, kind of; the second part, no. The utilitarian holds that the right thing to do is determined by what maximizes world utility, which is produced by utility functions. All utility, including your own, is given equal weight in the "moral decision" function. As for egoism, it simply means that you consider others' utility functions to the degree that they're a part of your utility function. It doesn't mean that you disregard them altogether.

Comment author: MathieuRoy 14 October 2014 11:37:58PM *  1 point

Ok thanks for your answers!

Comment author: blacktrance 14 October 2014 05:22:32AM *  4 points

That is an inaccurate definition of nihilism because it doesn't match what nihilists actually believe. Not only do they reject intrinsic morality, they reject all forms of morality altogether. Someone who believes in any kind of moral normativity (e.g. a utilitarian) cannot be a nihilist.

Utilitarianism is used here to mean "the normative ethical theory that one ought to maximize the utility of the world". This is in contrast to something like egoism ("the normative ethical theory that one ought to maximize one's own utility") and other forms of consequentialism.

Comment author: MathieuRoy 14 October 2014 09:51:20PM *  0 points

Thank you for your answer.

Do nihilists think they have no goals (aka terminal values), or that they don't have goals about fulfilling others' goals, or is it something else?

Utilitarianism is used here to mean "the normative ethical theory that one ought to maximize the utility of the world".

Ok, so would it be right to say the following? Utilitarianism means giving equal weight to everyone's utility function (including yours) in your "meta" utility function. Egoism means you don't consider others' utility functions in your utility function.

And then there is everything in-between (meaning giving more weight to your utility function than to others' utility functions in your "meta" utility function).
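To make the "weighted meta utility function" framing above concrete, here is a minimal sketch (my own illustration, not from the thread; the agents, numbers, and simple weighted-sum form are all assumptions, and the zero-weight "egoist" is a simplification of blacktrance's description, under which others count only insofar as they already appear in your own utility function):

```python
from typing import Callable, Sequence

Action = str
UtilityFn = Callable[[Action], float]

def meta_utility(utilities: Sequence[UtilityFn], weights: Sequence[float]) -> UtilityFn:
    """Build a "meta" utility function as a weighted sum of individual utility functions."""
    assert len(utilities) == len(weights)
    return lambda action: sum(w * u(action) for u, w in zip(utilities, weights))

# Two hypothetical agents choosing between actions "a" and "b".
u_me    = lambda a: {"a": 10.0, "b": 0.0}[a]    # I prefer "a"
u_other = lambda a: {"a": 0.0, "b": 12.0}[a]    # they prefer "b"
everyone = [u_me, u_other]

utilitarian = meta_utility(everyone, [1.0, 1.0])   # equal weight to every utility function
egoist      = meta_utility(everyone, [1.0, 0.0])   # only my own utility function counts
in_between  = meta_utility(everyone, [1.0, 0.3])   # more weight on mine than on theirs

for name, welfare in [("utilitarian", utilitarian), ("egoist", egoist), ("in-between", in_between)]:
    best = max(["a", "b"], key=welfare)
    print(f"{name}: picks {best!r}")
# utilitarian picks 'b' (0 + 12 > 10 + 0); egoist and in-between both pick 'a'.
```

With equal weights the aggregate picks whichever action has the highest total utility; as the weight on others falls toward zero, the choice converges to the purely self-interested one.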
