Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Comment author: Liron 14 May 2017 06:45:42AM 0 points [-]

By "reality-controlled", I don't just mean "external reality", I mean the part of external reality that your belief claims to be about.

Understanding truth in terms of "correspondence" brings me noticeably closer to coding up an intelligent reasoner from scratch than those other words.

The simple truth is that brains are like maps, and true-ness of beliefs about reality is analogous to accuracy of maps about territory. This sounds super obvious, which is why Eliezer called it "The Simple Truth". But it runs counter to a lot of bad philosophical thinking, which is why Eliezer bothered writing it.

Comment author: TheAncientGeek 14 May 2017 12:22:37PM 0 points [-]

Understanding truth in terms of "correspondence" brings me noticeably closer to coding up an intelligent reasoner from scratch than those other words.

If the correspondence theory cannot handle maths or morals, you will end up with a reasoner that cannot handle maths or morals.

The simple truth is that brains are like maps, and true-ness of beliefs about reality is analogous to accuracy of maps about territory.

You need to show that that simple theory also deals with the hard cases....because EY didn't.

But it runs counter to a lot of bad philosophical thinking, which is why Eliezer bothered writing it.

It's a piece of bad thinking that runs counter to philosophy. You don't show that something works in all cases by pointing out, however loudly or exasperatedly, that it works in the easy cases, where it is already well known to work.

Comment author: Liron 15 June 2011 05:13:45PM 8 points [-]

Here are the main points I understood:

The only way you can be sure your mental map accurately represents reality is by allowing a reality-controlled process to draw your mental map.

A sheep-activated pebble-tosser is a reality-controlled process that makes accurate bucket numbers.

The human eye is a reality-controlled process that makes accurate visual cortex images.

Natural human patterns of thought like essentialism and magical thinking are NOT reality-controlled processes and they don't draw accurate mental maps.

Each part of your mental map is called a "belief". The parts of your mental map that portray reality accurately are called "true beliefs".

Q: How do you know there is such a thing as "reality", and your mental map isn't all there is? A: Because sometimes your mental map leads you to make confident predictions, and they still get violated, and the prediction-violating thingy deserves its own name: reality.
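The pebble-tosser point above can be made concrete. Here is a minimal sketch (the function names and setup are illustrative, not from the post): a count is "accurate" exactly when the process producing it is causally driven by the territory itself, as opposed to a process driven only by what someone would like to believe.

```python
# Sketch of a reality-controlled process vs. a non-reality-controlled one.
# The "territory" is the actual list of sheep; the "map" is a pebble count.

def reality_controlled_count(sheep_leaving):
    """Toss one pebble into the bucket per sheep that passes the gate."""
    bucket = 0
    for _ in sheep_leaving:
        bucket += 1  # each pebble is caused by a sheep, not by an opinion
    return bucket

def wishful_count(desired_number):
    """A non-reality-controlled process: write down whatever you'd like."""
    return desired_number  # no causal link to the sheep at all

sheep = list(range(13))  # the territory: 13 actual sheep
pebbles = reality_controlled_count(sheep)
assert pebbles == len(sheep)  # the map matches the territory by construction
```

Nothing about `wishful_count` prevents it from returning the right number by luck; the point is that only the first process is *systematically* accurate, because the sheep themselves control the bucket.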

Comment author: TheAncientGeek 11 May 2017 12:30:43PM *  0 points [-]

The only way you can be sure your mental map accurately represents reality is by allowing a reality-controlled process to draw your mental map.

Everything is reality, so that is a distinction that doesn't make a difference. All illusions and errors are produced by real processes. (Or is "reality" being used to mean "external reality"?)

The human eye is a reality-controlled process that makes accurate visual cortex images.

Sometimes. But being reality controlled isn't a good criterion for when, since it is never false.

Natural human patterns of thought like essentialism and magical thinking are NOT reality-controlled processes and they don't draw accurate mental maps.

They are performed by real brains. If "reality controlled" just means producing the right results, the whole argument is circular.

Each part of your mental map is called a "belief". The parts of your mental map that portray reality accurately are called "true beliefs".

Why is it important to taboo the words "accurate", "correct", "represent", "reflect", "semantic", "believe", "knowledge", "map", or "real", but not the word "correspond"?

Comment author: MrMind 15 June 2011 10:50:36AM 1 point [-]

Here Eliezer said:

"The Simple Truth" was generated by an exercise of this discipline to describe "truth" on a lower level of organization, without invoking terms like "accurate", "correct", "represent", "reflect", "semantic", "believe", "knowledge", "map", or "real".

Comment author: TheAncientGeek 11 May 2017 12:22:05PM 0 points [-]

or "correspond?"

Comment author: TheAncientGeek 08 May 2017 01:41:54PM 0 points [-]

There's this guy named Eliezer Yudkowsky. He's really, really smart.

Who told you that?

Comment author: TheAncientGeek 08 May 2017 01:25:07PM 1 point [-]

Nitpick: cash for cache.

Comment author: RomeoStevens 03 May 2017 06:19:01PM *  11 points [-]

Having spent years thinking about this, and having had the opportunity to talk with open-minded, intelligent, successful people in social groups, extended family, etc., I concluded that most explicit discussion of the value of inquiring into values and methods (scope sensitivity and epistemological rigor being two of the major threads of what applied rationality looks like) just works incredibly rarely, and only then if there is strong existing interest.

Taking ideas seriously and trusting your own reasoning methods as a filter is a dangerous, high-variance move that most people are correct to shy away from. My impression of the appeal of LW, retrospectively, is that it (on average) attracted people who were or are underperforming relative to g (this applies to myself). When you are losing you increase variance. When you are winning you decrease it.

I eventually realized that what I was really communicating to people's system 1 was something like "Hey, you know those methods of judgment like proxy measures of legitimacy and mimesis that have granted you a life you like and that you want to remain stable? Those are bullshit, throw them away and start using these new methods of judgment advocated by a bunch of people who aren't leading lives resembling the one you are optimizing for."

This has not resulted in many sales. It is unrealistic to expect to convert a significant fraction of the tribe to shamanism.

Comment author: TheAncientGeek 08 May 2017 11:29:32AM 0 points [-]

My impression of the appeal of LW, retrospectively, is that it (on average) attracted people who were or are underperforming relative to g (this applies to myself). When you are losing you increase variance. When you are winning you decrease it.

There's also the issue of having plenty of spare time.

Comment author: vasaka 29 April 2017 04:19:49AM *  0 points [-]

I think I can show how probability is not purely in the mind but also an inherent property of things, bear with me.

Let's take the event of seeing snow outside. For simplicity, suppose we know that snow is out there three months a year, in winter; that fact is well tested and repeats each year. That distribution of snowy days is a property of reality. When we come out of a bunker after spending an unknown amount of time there, we assign probability 1/4 to seeing snow, and that number is a function of both our uncertainty about the date and our precise knowledge of when snow is out there. 1/4 is a precise description of reality if our scope is not just one day but a whole year. In this case we have a precise map, and our uncertainty is a lack of knowledge of our place on the map. We also know that if we do not have a date or season, there is no better prediction, and this is a property of things too.

Additionally, having the probability distribution, you can perfectly predict the accumulated effect of a series of events, and this ability to predict something precisely is an indication that you have grasped something about reality.

Returning to the coin: the 0.5 prediction of one throw is a function of our uncertainty, but our prediction of the sum of a long series (where 1 is heads and 0 is tails) is a result of our knowledge of the coin's properties, expressed as a probability.
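The coin claim is easy to check numerically. A quick sketch (illustrative, not from the comment): the probability of a single flip tells us nothing certain about that flip, yet the same number pins down the sum of a long series quite tightly.

```python
# Simulate a fair coin: 1 = heads, 0 = tails. The single-flip probability
# (0.5) reflects uncertainty, but it predicts the aggregate very well.
import random

random.seed(0)
n = 100_000
flips = [random.randint(0, 1) for _ in range(n)]

total = sum(flips)
expected = 0.5 * n                    # prediction from p = 0.5
sigma = (n * 0.25) ** 0.5             # standard deviation of the sum, ~158

# The realized sum lands within a few standard deviations of the prediction,
# even though no individual flip was predictable.
assert abs(total - expected) < 5 * sigma
```

The per-flip distribution is what lets the aggregate be predicted precisely; in that sense the 0.5 is doing real predictive work, not merely recording ignorance.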

Comment author: TheAncientGeek 29 April 2017 01:30:57PM 0 points [-]

A statistical distribution is objective, and can be an element in a probability calculation, but is not itself probability.

Comment author: arundelo 27 April 2017 11:20:38PM 0 points [-]

Eliezer probably means "sapient":

"Sentience is commonly used in science fiction and fantasy as synonymous with sapience, although the words aren't synonyms."

(Or maybe by "is sentient", he means to say, "is a person in the moral sense".)

Comment author: TheAncientGeek 28 April 2017 06:36:41AM 0 points [-]

Well, sentient means feeling and sapient means knowing, and that's about all there is to it... neither term is technically precise, although they are often bandied around as though they are.

Comment author: TheAncientGeek 27 April 2017 06:55:04AM *  1 point [-]

The normal control problem assumes that no specific agency in the programs (especially not super-intelligent agency)

There seems to be a verb missing in that sentence...did you mean ...assumes that there is no specific agency in the programs...?

(Nitpicks aside, I think this is the right approach... build current safety and control knowledge, rather than assume that all future AIs will follow some very specific decision theory).

Comment author: TheAncientGeek 27 April 2017 06:12:18AM 2 points [-]

Now, there isn’t a perfect consensus on these issues. For probability, there’s the debate between Bayesians and frequentists. I may think the Bayesian perspective is superior, and points to a specific understanding of randomness as a subjective phenomenon (so randomness and uncertainty are really the same thing).

Funny how the adoption of a particular framework of probability theory, for pragmatic reasons, can prove something about reality.
