
Comment author: Bound_up 22 March 2017 01:36:56PM 2 points [-]

Maybe there could be some high-profile positive press for cryonics if it became standard policy to freeze endangered species seeds or DNA for later resurrection

Comment author: Bound_up 20 March 2017 11:49:25PM 0 points [-]

Suppose there are 100 genes that figure into intelligence, with a 50% chance of inheriting each one.

The most common result would be for someone to get 50/100 of these genes and have average intelligence.

Some smaller number would get 51 or 49, and a smaller number still would get 52 or 48.

And so on, until at the extremes of the scale, so few people get 0 or 100 of them that, in all likelihood, no one ever born has had all 100 (the odds of getting all 100 are 1 in 2^100, vastly rarer than the roughly 100 billion humans who have ever lived).

As such, incredible superhuman intelligence would be manifest in a human who just got lucky enough to have all 100 genes. If some or all of these genes could be identified and manipulated in the genetic code, we'd have unprecedented geniuses.
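The distribution described above is a binomial(100, 0.5). A quick sketch of the arithmetic, using only the toy model from the comment (the gene count, independence, and 50% odds are the comment's assumptions, not real genetics):

```python
# Toy model from the comment above: 100 independent "intelligence genes",
# each inherited with probability 0.5. The number inherited then follows
# a binomial(100, 0.5) distribution.
from math import comb

N, P = 100, 0.5

def prob_exactly(k: int) -> float:
    """Probability of inheriting exactly k of the N genes."""
    return comb(N, k) * P**k * (1 - P)**(N - k)

print(f"P(50 genes)  = {prob_exactly(50):.4f}")    # modal outcome, ~0.08
print(f"P(60 genes)  = {prob_exactly(60):.5f}")    # already ~100x rarer
print(f"P(100 genes) = {prob_exactly(100):.3e}")   # 2**-100, ~7.9e-31
```

The modal outcome (50 genes) occurs in roughly 8% of people, while the probability of all 100 is about 7.9 × 10⁻³¹, which is why the all-100 combination has plausibly never occurred in any human who has ever lived.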

Comment author: Alicorn 17 March 2017 01:46:56AM 19 points [-]

If you like this idea but have nothing much to say please comment under this comment so there can be a record of interested parties.

Comment author: Bound_up 17 March 2017 01:27:56PM 1 point [-]

Absolutely. I've been looking into the different places looking to do something like this (like the Accelerator Project). Would definitely be interested in any similar things going on

Comment author: Bound_up 14 March 2017 12:49:12AM 0 points [-]

The not-"rational" (read: not central to the rationalist concept cluster, not part of rationalist culture) but nonetheless rational things we need to do.

The value of pretending and of self-talk I mention in another comment. The value of being nice is another thing not strongly associated with "rationalism," but one which is, I think, rational to recognize.

There are others. Certain kinds of communication. Why can't any "rationalists" talk? The best ones are so wrapped up in nerd-culture signifiers that they appeal only to other nerds; you can practically identify people who aren't "rationalists" by whether they sound nerdy. There's probably a place for sounding a lot more like Steve Harvey, or a pastor, or a politician, if we care about effectively communicating with people who aren't nerds.

There are other anti-rationalist-culture things we should probably look for and develop.

Comment author: Bound_up 11 March 2017 02:53:44PM 4 points [-]

On the Value of Pretending

Actors don't break down the individual muscle movements that go into expression; musicians don't break down the physical properties of the notes or series of notes that produce expression.

They both simulate feeling to express it. They pretend to feel it. If we want to harness confidence, amiability, and energy, maybe there's some value in pretending and simulating (what would "nice person" do?).

Cognitive Behavioral Therapy teaches that our self-talk strongly affects us, counseling us not to say things like "Oh, I suck." Positive self-talk ("I can do this") may be worth practicing.

I'm not sure why, but this feels not irrational, yet highly not-"rational" (against the culture associated with "rationality"). This also intrigues me...

Comment author: Bound_up 11 March 2017 01:45:10PM 1 point [-]

A charity is a business that sells feeling good about yourself, and the admiration of others, as its products.

To make a lucrative product, don't ask "what needs need filling," ask "what would help people signal more effectively."

Comment author: I_D_Sparse 11 March 2017 09:17:08AM *  1 point [-]

Regarding instrumental rationality: I've been wondering for a while now if "world domination" (or "world optimization", as HJPEV prefers) is feasible. I haven't entirely figured out my values yet, but whatever they turn out to be, WD/WO sure would be handy for achieving them. But even if WD/WO is a ridiculously far-fetched dream, it would still be a very good idea to know one's approximate chances of success with various possible paths to achieving one's values. I have therefore come up with the "feasibility problem." Basically, a solution to the problem consists of an estimation of how much one can actually hope to influence the world, and to what extent one can actually fulfill one's values. I think it would be very wise to solve the feasibility problem before attempting to take over the world, or become the President, or lead a social revolution, or improve the rationality of the general populace, etc.

Solving the FP would seem to require a deep understanding of how the world operates (anthropomorphically speaking, if you get my drift; I'm talking about the hoomun world, not physics and chemistry).

I've even constructed a GPOATCBUBAAAA (general plan of action that can be used by any and all agents): first, define your utility function, and also learn how the world works (easier said than done). Once you've completed that, you can apply your knowledge to solve the FP, and then you can construct a plan to fulfill your utility function, and then put it into action.

This is probably a bit longer than 100 words, but I'm posting it here and not in the open thread because I have no idea if it's of any value whatsoever.

Comment author: Bound_up 11 March 2017 01:41:45PM 1 point [-]

Am I reading this right as, basically, crack the alignment problem manually, and then finish science (then proceed to take over the world)?

Comment author: bogus 10 March 2017 08:18:18PM 2 points [-]

Here, you can post quick little (preferably under 100 words) insights, with the explicit understanding that the idea is unpolished and unvetted, and hey, maybe nobody will be interested, but it's worth a shot.

My quick unpolished insight is that we already have the weekly Open Thread for this sort of thing. But thanks for the reminder, anyway.

Comment author: Bound_up 11 March 2017 04:22:03AM 1 point [-]

While this idea may not be of interest to everybody, it has already been vetted by the Open Thread

Comment author: I_D_Sparse 10 March 2017 07:43:43PM *  0 points [-]

This is an interesting idea, although I'm not sure what you mean by

It can work without people understanding why it works

Shouldn't the people learning it understand it? It doesn't really seem much like learning otherwise.

Comment author: Bound_up 11 March 2017 04:21:06AM 0 points [-]

You don't have to understand what it does or why it works (or care about those) to successfully perform it. You can put yourself in the other side's shoes without understanding the effects of doing so.

Comment author: Bound_up 10 March 2017 02:26:16PM *  5 points [-]

I had an idea to increase average people’s rationality with 4 qualities:

It doesn’t seem/feel “rationalist” or “nerdy.”

It can work without people understanding why it works

It can be taught without understanding its purpose

It can be perceived as about politeness

A high school class, where people try to pass Intellectual Turing Tests. It's not POLITE/sophisticated to assert opinions if you can't show that you understand the people that you're saying are wrong.

We already have a lot of error-detection abilities when our minds criticize others' ideas, we just need to access that power for our own ideas.

Comment author: Bound_up 10 March 2017 02:26:59PM 0 points [-]

This is about 100 words, in case you want to get a feel for the length
