
Comment author: UmamiSalami 01 August 2017 01:41:30AM *  1 point [-]

I think it's about a 0.75 probability, conditional upon smarter-than-human AI being developed. Guess I'm kind of an optimist. TL;DR I don't think it will be very difficult to impart your intentions into a sufficiently advanced machine.

Comment author: The_Jaded_One 04 August 2017 01:12:13AM *  0 points [-]

I don't think it will be very difficult to impart your intentions into a sufficiently advanced machine

Counterargument: it will be easy to impart an approximate version of your intentions, but hard to control how those values evolve as you crank up the power. E.g. evolution made humans want sex as a proxy for reproduction, and we invented condoms.

No-one will really care about this until it's way too late and we're all locked up in nice padded cells and drugged up, or something equally bad but hard for me to imagine right now.

Comment author: The_Jaded_One 04 August 2017 01:08:38AM 0 points [-]

I think 50% is a reasonable belief given our very limited grasp of the problem.

Most of the weight for success comes from FAI turning out to be quite easy and the many worries expressed on this site not being realistic. Some of the weight comes from a concerted effort to solve hard problems.

Comment author: HungryHippo 01 August 2017 08:03:49PM 1 point [-]

Your link doesn't lead anywhere. :-)

Comment author: The_Jaded_One 01 August 2017 09:11:24PM 1 point [-]

I guess there is a gap between the OP's intention and his/her behaviour? They intended to link to something, but it actually just links to itself?

Comment author: oge 27 July 2017 03:58:45PM 1 point [-]

Malnutrition is the visible surface symptom of "these are uncivilised, backwards people caught in a series of petty tribal wars".

I agree.

Could you tell me how you came up with the list of African backward values? I currently live in an African country; I'd like the names of all the values I'd need to instil to avoid seeing preventable suffering around me.

(FYI I'd thought that having a public list of salaries and paying higher taxes, a la Norway, would be mostly sufficient to fix things)

Comment author: The_Jaded_One 01 August 2017 03:28:36PM 0 points [-]

Thanks for your comment! Can you say which country?

Could you tell me how you came up with the list of African backward values?

Not specifically; the human brain tends to collect overall impressions rather than keep track of sources.

I'd like the names of all the values I'd need to instil to avoid seeing preventable suffering around me.

This sounds like a seriously tough battle.

Comment author: Viliam 21 June 2017 11:44:29AM 2 points [-]

+1 nice comment; funny and insightful

Things like this are nice when made rarely, and horrible when they become the norm. How to prevent that? Have a limited number of +1's per user per week? (Or per total karma?) Making them a scarce resource could make them even more valuable...

Comment author: The_Jaded_One 24 June 2017 12:26:50PM 0 points [-]

Yeah, I mean maybe just make them float to the bottom?

Comment author: The_Jaded_One 20 June 2017 08:57:48PM *  3 points [-]

One problem here is that we are trying to optimize a thing that is broken on an extremely fundamental level.

Rationality, transhumanism, and hardcore nerdery in general attract a lot of extremely socially dysfunctional human beings. These communities also skew towards a ridiculously biologically-male-heavy gender distribution.

Sometimes life throws unfair challenges at you; the challenge here is that ability and interest in rationality correlate negatively with being a well-rounded human.

We should search very hard for extreme out-of-the-box solutions to this problem.

One positive lead I have been given is that the anti-aging/life-extension community is a lot more gender balanced. Maybe LW should try to embrace that. It's not a solution, but that's the kind of thing I'm thinking of.

Comment author: The_Jaded_One 20 June 2017 08:47:34PM 2 points [-]

...agree with that isn't just “+1 nice post.” Here are some strategies...

How about the strategy of writing "+1 nice post"? Maybe we're failing to see the really blatantly obvious solution here....

+1 nice post btw

Comment author: [deleted] 26 May 2017 08:43:41PM *  26 points [-]

a

Comment author: The_Jaded_One 14 June 2017 07:50:03PM 0 points [-]

someone was accidentally impregnated and then decided not to abort the child, going against what had previously been agreed upon, and proceeded to shamelessly solicit donations from the rationalist community to support her child

They were just doing their part against dysgenics and should be commended.

Comment author: [deleted] 26 May 2017 08:43:41PM *  26 points [-]

a

Comment author: The_Jaded_One 14 June 2017 07:47:28PM 0 points [-]

word is going around that Anna Salamon and Nate Soares are engaging in bizarre conspiratorial planning around some unsubstantiated belief that the world will end in ten years

Sounds interesting, I'd like to hear more about this.

Comment author: RomeoStevens 03 May 2017 06:19:01PM *  12 points [-]

Having spent years thinking about this, and having had the opportunity to talk with open-minded, intelligent, successful people in social groups, extended family, etc., I concluded that most explicit discussion of the value of inquiring into values and methods (scope sensitivity and epistemological rigor being two of the major threads of what applied rationality looks like) works incredibly rarely, and only then if there is strong existing interest.

Taking ideas seriously and trusting your own reasoning methods as a filter is a dangerous, high-variance move that most people are right to shy away from. In retrospect, my impression of the appeal of LW is that it (on average) attracted people who were or are underperforming relative to g (this applies to me). When you are losing, you increase variance; when you are winning, you decrease it.

I eventually realized that what I was really communicating to people's System 1 was something like: "Hey, you know those methods of judgment, like proxy measures of legitimacy and mimesis, that have granted you a life you like and that you want to remain stable? Those are bullshit; throw them away and start using these new methods of judgment advocated by a bunch of people who aren't leading lives resembling the one you are optimizing for."

This has not resulted in many sales. It is unrealistic to expect to convert a significant fraction of the tribe to shamanism.
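The "increase variance when losing" point above can be made concrete with a small sketch. This is purely illustrative and not from the comment itself: the target, means, and standard deviations are made-up numbers, and outcomes are modeled as normal distributions only for convenience. With a fixed mean below a target, a higher standard deviation raises the probability of clearing the target; with a mean already above the target, it lowers it.

    # Illustrative only: model an outcome as Normal(mean, sd) and ask how often
    # it clears a fixed target. All numbers here are hypothetical.
    from statistics import NormalDist

    TARGET = 100.0

    for mean in (90.0, 110.0):       # below the target ("losing") vs. above it ("winning")
        for sd in (5.0, 20.0):       # low-variance vs. high-variance strategy
            p_clear = 1 - NormalDist(mean, sd).cdf(TARGET)
            print(f"mean={mean:5.1f}  sd={sd:4.1f}  P(outcome >= target) = {p_clear:.2f}")

Under these assumed numbers, raising the standard deviation moves the chance of clearing the target from roughly 0.02 to 0.31 when the mean is below it, and from roughly 0.98 down to 0.69 when the mean is above it, which is the asymmetry the comment is pointing at.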

Comment author: The_Jaded_One 04 May 2017 05:49:36PM 0 points [-]

In retrospect, my impression of the appeal of LW is that it (on average) attracted people who were or are underperforming relative to g (this applies to me). When you are losing, you increase variance; when you are winning, you decrease it.

This also applies to me.
