
Comment author: Yosarian2 24 October 2017 10:35:20PM 0 points [-]

I certainly think you're right that the conscious mind and conscious decisions can, to a large extent, rewrite a lot of the programming of the brain.

I am surprised that you think that most rationalists don't think that. (That sentence is a mouthful, but you know what I mean.) A lot of rationalist writing is devoted to working on ways to do exactly that; a lot of people have written about how just reading the Sequences helped them basically reprogram their own brains to be more rational in a wide variety of situations.

Are there a lot of people in the rationalist community who think that conscious thought and decision making can't do major things? I know there are philosophers who think that maybe consciousness is irrelevant to behavior, but that philosophy seems very much at odds with LessWrong-style rationality and the way people on LessWrong tend to think about and talk about what consciousness is.

Comment author: entirelyuseless 21 October 2017 12:40:44PM 2 points [-]

"Be confused, bewildered or distant when you insist you can't explain why."

This does not fit the character. A real paperclipper would give very convincing reasons.

Comment author: Yosarian2 22 October 2017 02:50:54PM 3 points [-]

He's not a superhuman intelligent paperclipper yet, just human level.

Comment author: SquirrelInHell 12 October 2017 11:33:00AM 0 points [-]

"Well calibrated" isn't a simple thing, though. It's always a conscious decision of how willing you are to tolerate false positives vs false negatives.

I beg to differ; being well calibrated has a mathematically precise definition. E.g. if you are thinking of a binary suffering/not-suffering classification (oversimplified, but it's just to make a point), then I want my perception to assign probabilities such that, when you compare them with the true answers, cross-entropy is minimized. That's pretty much what I care about when I'm fixing my perception.

Of course there's the question of how aware at each moment you want to be of certain information. But you want to be well calibrated nonetheless.
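To make "cross-entropy is minimized" concrete, here is a minimal sketch with toy numbers of my own invention (the labels and probabilities are illustrative, not anyone's real estimates). It shows why calibration is a precise target rather than a free choice of threshold: one confidently wrong judgment costs more than a whole run of honestly moderate probabilities.

```python
import math

def cross_entropy(y_true, y_pred):
    """Mean binary cross-entropy between true labels and predicted probabilities."""
    eps = 1e-12  # clip probabilities to avoid log(0)
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)

# Hypothetical "true answers": 1 = suffering, 0 = not suffering
truth = [1, 0, 0, 1, 0]

# A calibrated observer assigns moderate, honest probabilities;
# an overconfident observer uses 0/1 and is confidently wrong about the fourth person.
calibrated = [0.8, 0.2, 0.1, 0.7, 0.3]
overconfident = [1.0, 0.0, 0.0, 0.0, 0.0]

print(cross_entropy(truth, calibrated))     # small: moderate probabilities, no disasters
print(cross_entropy(truth, overconfident))  # large: one confident miss dominates
```

The single confidently wrong answer contributes roughly -log(eps) to the overconfident observer's score, which is why minimizing cross-entropy pushes you toward stating probabilities you can actually back up.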

if you think that almost everyone you see is suffering, you may be doing just that.

Or, you know, it's just simply true that people experience much more suffering than happiness. Also, they aren't so very aware of this themselves, because of how memories work.

Comment author: Yosarian2 12 October 2017 08:56:28PM 1 point [-]

Or, you know, it's just simply true that people experience much more suffering than happiness. Also, they aren't so very aware of this themselves, because of how memories work.

That certainly is not true of me or of my life overall, except during a few short periods. I don't have the same access to other people's internal states, but I doubt it is true of most people.

There certainly are a significant number of people who it may be true of, people who suffer from depression or chronic pain or who are living in other difficult circumstances. I highly doubt that that's the majority of people, though.

Comment author: Gunnar_Zarncke 07 October 2017 09:25:04PM *  0 points [-]

In which Different World do you live?

The SSC article Different Worlds discussed how different people perceive the (same) world to be quite different places. Let's find out whether that is also the case for the limited LW population.

My prediction (based on the follow-up SSC post) is that the

This poll is based on a poll I conducted with my four boys (ages 6 to 13) after reading the SSC article. I found it quite surprising how differently even such a presumably homogeneous group perceives its environment.

This poll is structured into two parts:

1) The first part is about your environment; how you see the people in the world around you.

2) The second part asks the same questions about you; how you see yourself.

Please consider taking a break between both parts and cover your answers from the first part.

Part 1:

How much action do you perceive in your environment?

calm/silent ... active/loud

How mindful is your environment?

unfriendly ... friendly

How smart are people in your environment on average?

dumb ... intelligent

How good are people in general?

evil ... good

How does your environment deal with minorities and human and behavioral variety?

racist/enforce conformity ... embrace variety

How much are people together or do things together?

prefer solitude/isolated ... sociable/gregarious

How are decisions in your environment typically made?

objective/rational ... emotional/intuitive

With how much force are things typically done in your environment? How careful are communications?

rude/abrasive/direct ... soft/easygoing

How are things organised in your environment?

ad-hoc/spontaneous ... structured/planned

How does your environment deal with risks?

cautious/shy ... courageous/brave

.

.

Pause here

.

.

.

Part 2:

How active are you?

calm/silent ... active/loud

How mindful in your communication are you?

unfriendly ... friendly

How smart are you?

dumb ... intelligent

How good are you?

evil ... good

How do you deal with minorities and human and behavioral variety?

racist/enforce conformity ... embrace variety

How much do you prefer to do things with others?

prefer solitude/isolated ... sociable/gregarious

How do you make decisions?

objective/rational ... emotional/intuitive

With how much force do you act and communicate?

rude/abrasive/direct ... soft/easygoing

How organised are you?

ad-hoc/spontaneous ... structured/planned

How do you deal with risks?

cautious/shy ... courageous/brave


Notes:

When I did my evaluation I considered counting each point as roughly 1/2 standard deviation from the mean. I'm pretty sure I didn't stick to it though.

Differences from the poll I did with my children:

  • That poll had a numeric scale from -5 to +5 which I decided was not suitable for the LW poll format.
  • That poll was done with all of them together, so they heard each others answers.
  • I skipped the intelligence and benevolence questions for self-rating, explaining that a) talking about one's own intelligence is often problematic and b) everyone is the hero of their own story.

I didn't change the order or direction of the questions. I think the choice of questions leaves something to be desired; I came up with them on a train ride with the boys.


Comment author: Yosarian2 09 October 2017 10:27:31PM 1 point [-]

Yeah, I'm not sure how to answer this. I would do one set of answers for my personal social environment and a completely different set of answers for my work environment, to such a degree that trying to just average them wouldn't work. I could pick one or the other.

Reference: I teach in an urban high school.

Comment author: Yosarian2 08 October 2017 03:58:29PM 5 points [-]

I didn't even know that the survey was happening, sorry.

If you do decide to keep running the survey for a little longer, I'd take it, if that data point helps.

Comment author: Yosarian2 07 October 2017 10:05:38PM *  1 point [-]

I think you need to try to narrow your focus on exactly what you mean by a "futurist institute" and figure out what specifically you plan on doing before you can think about any of these issues.

Are you thinking about the kind of consulting agency that companies get advice from on what the market might look like in 5 years and what technologies their competitors are using? Or about something like a think tank that does research and writes papers with the intent of influencing political policy, and is usually supported by donations? Or an academic group, probably tied to a university, which publishes academic papers, similar to what Nick Bostrom does at Oxford? Or something that raises money primarily for scientific and technological research? Or maybe an organization similar to H+ that tries to spread awareness of transhumanist/singularity-related issues, publishes newsletters, has meetings, and generally tries to change people's minds about futurist, technological, AI, and/or transhumanist issues? Or something else entirely?

Basically, without more details about exactly what you are trying to do, I don't think anyone here is going to be able to offer very good advice. I suspect you may not be sure yourself yet, so maybe the first step is to think through the different options and try to narrow your initial focus a bit.

Comment author: SquirrelInHell 05 October 2017 04:43:51PM 1 point [-]

The best tradeoff is when you are well calibrated, just like with everything else.

In the "selfish default" you basically never have false positives, but also you have false negatives like all the time. So duh.

Comment author: Yosarian2 05 October 2017 08:45:44PM *  2 points [-]

The best tradeoff is when you are well calibrated, just like with everything else.

"Well calibrated" isn't a simple thing, though. It's always a conscious decision of how willing you are to tolerate false positives vs false negatives.

Anyway, I'm not trying to shoot you down here; I really did like your article, and I think you made a good point. Just saying that it's possible to have a great insight and still overshoot or over-correct for a previous mistake you've made, and if you think that almost everyone you see is suffering, you may be doing just that.
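The tradeoff being described can be sketched with a toy example (all scores and labels here are invented purely for illustration, including the hypothetical "suffering detector"): sweeping a decision threshold over the same scores trades false positives against false negatives.

```python
# Hypothetical scores from some "suffering detector": higher = more likely suffering.
# Labels: 1 = actually suffering, 0 = not. All numbers made up for illustration.
scores = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]
truth  = [1,   1,   0,   1,   0,   0]

def confusion(threshold):
    """Count (false positives, false negatives) at a given decision threshold."""
    fp = sum(1 for s, t in zip(scores, truth) if s >= threshold and t == 0)
    fn = sum(1 for s, t in zip(scores, truth) if s < threshold and t == 1)
    return fp, fn

# Raising the threshold trades false positives for false negatives.
for thr in (0.2, 0.5, 0.9):
    print(thr, confusion(thr))
```

A lenient threshold sees suffering everywhere (false positives); a strict one misses real cases (false negatives). Calibrating the underlying probabilities and choosing where to cut are two separate decisions, which may be what the disagreement here comes down to.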

Comment author: SquirrelInHell 05 October 2017 08:19:55AM 2 points [-]

Wrong guess, both of you!

It's a specific skill that I have learned to execute pretty much at will.

I also have some of the opposite skill, which turns all of this off.

Since I learned them, my base level seems higher than before, but not in a life-affecting way.

What affects me is, rather, that the wall separating me from this other world is much thinner, and feels much more real somehow.

Comment author: Yosarian2 05 October 2017 09:27:51AM 3 points [-]

There has to be some kind of trade-off between false positives and false negatives here, doesn't there? If you decide to "use that skill" to see more suffering, isn't it likely that you are getting at least some false positives, some cases where you think someone is suffering and they aren't?

Comment author: SquirrelInHell 05 October 2017 08:27:52AM *  1 point [-]

Hmm, interesting point!

On one hand, my intuition suggests that "happiness" is too ill-defined a concept to do the work here (sorry if that sounds annoyingly mysterious; I'm not sure what it would mean exactly either), and thinking in these terms can only take you so far.

OTOH, there's definitely some stuff you can do to push your "happiness baseline" around a little bit, and I think some people from the rationality blogosphere have reported on this (Agenty Duck? can't find it).

Comment author: Yosarian2 05 October 2017 08:58:53AM *  1 point [-]

If "happiness" is too vague a term or has too many other meanings we don't necessarily want to imply, we could just say "positive utility". As in "try to notice when you or the people around you are experiencing positive utility".

I do think that actually taking note of that probably does help you move your happiness baseline; it's basically a rationalist version of "be thankful for the good things in your life". Something as simple as "you know, I enjoy walking the dog on a crisp fall day like this". Noticing when other people seem to be experiencing positive utility is also probably important in becoming a more morally correct utilitarian yourself, likely just as important as noting other people's suffering/negative utility.

Comment author: Yosarian2 05 October 2017 01:53:19AM *  1 point [-]

Really interesting essay.

It also made me wonder if the opposite is also a skill you need to learn; do people need to learn how to see happiness when that happens around them? Some people seem strangely blind to happiness, even to their own.
