ciphergoth comments on Welcome to Less Wrong! - Less Wrong

48 Post author: MBlume 16 April 2009 09:06AM


Comment author: ThomasRyan 02 February 2010 05:48:29PM *  5 points [-]

Hello.

Call me Thomas. I am 22. The strongest force directing my life can be called an extreme phobia of disorder. I came across Overcoming Bias and Eliezer Yudkowsky's writings around the same time, in high school, shortly after reading GEB and The Singularity Is Near.

The experience was not a revelation but a relief. I am completely sane! Being here is solace. The information here is mostly systematized, which has greatly helped to organize my thoughts on rationality and has saved me a great amount of time.

I am good at tricking people into thinking I am smart, which you guys can easily catch. And I care about how you guys will perceive me, which means that I have to work hard if I want to be a valuable contributor. That is something I am not used to (working hard), since I usually do good enough work with minimal effort.

My greatest vices are romantic literature, smooth language, and flowery writing. From Roman de la Rose, to The Knight's Tale, to Paradise Lost, to One Hundred Years of Solitude. That crap is like candy to me.

Bad music repulses me. I get anxious and irritable and will probably throw a fit if I don't get away from the music. Anything meticulous or tedious will make me antsy and shaky. Bad writing has the same effect on me. Though I am punctilious; there's a difference.

My favorite band is Circulatory System, which speaks directly to my joys and fears and hopes. If you haven't listened to them, I highly recommend you do so. The band name means "Human." It is about what it means to be us, about the circular nature of our sentience, and about the circles drawn in history with every new generation. http://www.youtube.com/watch?v=a_jidcdzXuU

I have opted out of college. I do not learn well in lectures. They are too slow, tedious, and meticulous. Books hold my attention better.

My biggest mistake? In school, never practicing retaining information. I do not have my months memorized and my vocabulary is terrible. It was much more fun to use my intelligence to "get the grade" than it was to memorize information. Now, this is biting me on the butt. I need to start practicing memorizing stuff.

I am currently in a good situation. My mom got a job far from her house, and she has farm animals. I made a deal with her, where I watch her house and the animals for free if she lets me stay there. I will be in this position for at least another year.

I have enough web design skills to be useful to web design firms, which brings me my income. I am also a hobbyist programmer, though not good enough yet to turn that skill into money.

I want to teach people to be more rational; that's what I want to do with my life. I am far from being the writer I want to be, and I have not yet made my ideas congruent and clear.

Anybody with good recommendations on how to best spend this year?

Thomas.

Comment author: ciphergoth 02 February 2010 06:38:59PM 0 points [-]

Hello, and welcome to the site!

Comment author: ThomasRyan 02 February 2010 08:06:49PM *  1 point [-]

Thank you, I'll be seeing you around :) .

Anyway, I have been thinking of starting my year off by reading Chris Langan's CTMU, but I haven't seen anything written about it here or on OB. And I am very wary of what I put into my brain (including LSD :P).

Any opinions on the CTMU?

Comment author: ciphergoth 02 February 2010 08:17:45PM *  3 points [-]

Google suggests you mean this CTMU.

Looks like rubbish to me, I'm afraid. If what's on this site interests you, I think you'll get a lot more out of the Sequences, including the tools to see why the ideas in the site above aren't really worth pursuing.

Comment author: ThomasRyan 02 February 2010 08:51:33PM *  1 point [-]

Introduction to the CTMU

Yeah, I know what it looks like: metaphysical rubbish. But my dilemma is that Chris Langan is the smartest known living man, which makes it really hard for me to shrug the CTMU off as nonsense. Also, from what I skimmed, it looks like a much deeper examination of reductionism and strange loops, which are ideas that I hold dear.

I've read and understand the sequences, though I'm not familiar enough with them to use them without a rationalist context.

Comment author: Eliezer_Yudkowsky 02 February 2010 09:28:09PM 5 points [-]

But my dilemma is that Chris Langan is the smartest known living man, which makes it really hard for me to shrug the CTMU off as nonsense.

Eh, I'm smart too. Looks to me like you were right the first time and need to have greater confidence in yourself.

Comment author: Morendil 02 February 2010 09:55:01PM 1 point [-]

More to the point, you do not immediately fail the "common ground" test.

Pragmatically, I don't care how smart you are, but whether you can make me smarter. If you are so much smarter than I am that you don't even bother, I'd be wasting my time engaging with your material.

Comment author: Eliezer_Yudkowsky 02 February 2010 10:05:18PM 3 points [-]

I should note that the ability to explain things isn't the same attribute as intelligence. I am lucky enough to have it. Other legitimately intelligent people do not.

Comment author: Morendil 02 February 2010 10:11:30PM 0 points [-]

If your goal is to convey ideas to others, instrumental rationality seems to demand you develop that capacity.

Comment author: Eliezer_Yudkowsky 02 February 2010 10:26:40PM 3 points [-]

Considering the extraordinary rarity of good explainers in this entire civilization, I'm saddened to say that talent may have something to do with it, not just practice.

Comment author: MrHen 02 February 2010 10:06:30PM 1 point [-]

I can learn from dead people, stupid people, or by watching a tree for an hour. I don't think I understand your point.

Comment author: Morendil 02 February 2010 10:22:12PM 0 points [-]

I didn't use the word "learn". My point is about a smart person conveying their ideas to someone. Taboo "smart". Distinguish the ability to reach goals from the ability to score high on mental aptitude tests. If they are goal-smart, and their goal is to convince, they will use their IQ-smarts to develop the capacity to convince.

Comment author: mattnewport 02 February 2010 09:23:58PM 5 points [-]

Being very intelligent does not imply not being very wrong.

Comment author: MartinB 02 November 2010 02:42:59AM 1 point [-]

You just get to make bigger mistakes than others. From the YouTube videos, Langan looks like a really bright fellow who has a very broken toolbox and little self-correction. Argh!

Comment author: Morendil 02 February 2010 09:49:45PM 6 points [-]

However intelligent he is, he fails to present his ideas so as to gradually build a common ground with lay readers. "If you're so smart, how come you ain't convincing?"

The "intelligent design" references on his Wikipedia bio are enough to turn me away. Can you point us to a well-regarded intellectual who has taken his work seriously and recommends his work? (I've used that sort of bridging tactic at least once, Dennett convincing me to read Julian Jaynes.)

Comment author: Cyan 02 February 2010 10:08:53PM *  5 points [-]

"If you're so smart, how come you ain't convincing?"

"Convincing" has long been a problem for Chris Langan. Malcolm Gladwell relates a story about Langan attending a calculus course in first year undergrad. After the first lecture, he went to offer criticism of the prof's pedagogy. The prof thought he was complaining that the material was too hard; Langan was unable to convey that he had understood the material perfectly for years, and wanted to see better teaching.

Comment author: pjeby 02 November 2010 06:40:25PM 2 points [-]

Yeah, I know what it looks like: metaphysical rubbish.

It is. I got as far as this paragraph of the introduction to his paper before I found a critical flaw:

Of particular interest to natural scientists is the fact that the laws of nature are a language. To some extent, nature is regular; the basic patterns or general aspects of structure in terms of which it is apprehended, whether or not they have been categorically identified, are its “laws”. The existence of these laws is given by the stability of perception.

At this point, he's already begging the question, i.e. presupposing the existence of supernatural entities. These "laws" he's talking about are in his head, not in the world.

In other words, he hasn't even got done presenting what problem he's trying to solve, and he's already got it completely wrong, and so it's doubtful he can get to correct conclusions from such a faulty premise.

Comment author: Tuukka_Virtaperko 05 January 2012 10:04:40PM *  0 points [-]

That's not a critical flaw. In metaphysics, you can't take for granted that the world is not in your head. The only thing you really can do is to find an inconsistency, if you want to prove someone wrong.

Langan has no problems convincing me. His attempt at constructing a reality theory is serious and mature and I think he conducts his business about the way an ordinary person with such aims would. He's not a literary genius like Robert Pirsig, he's just really smart otherwise.

I've never heard anyone present criticism of the CTMU that would actually imply understanding of what Langan is trying to do. The CTMU has a mistake: Langan believes (p. 49) the CTMU satisfies the Law Without Law condition, which states: "Concisely, nothing can be taken as given when it comes to cosmogony." (p. 8)

According to the Mind Equals Reality Principle, the CTMU is comprehensive. This principle "makes the syntax of this theory comprehensive by ensuring that nothing which can be cognitively or perceptually recognized as a part of reality is excluded for want of syntax". (p. 15) But undefinable concepts can neither be proven to exist nor proven not to exist. This means the Mind Equals Reality Principle must be assumed as an axiom. But to do so would violate the Law Without Law condition.

The Metaphysical Autology Principle could be stated as an axiom, which would entail the nonexistence of undefinable concepts. This principle "tautologically renders this syntax closed or self-contained in the definitive, descriptive and interpretational senses". (p. 15) But it would be arbitrary to have such an axiom, and the CTMU would again fail to fulfill Law Without Law.

If that makes the CTMU rubbish, then Russell's Principia Mathematica is also rubbish, because it has a similar problem, which was pointed out by Gödel. EDIT: Actually, the problem is somewhat different from the one addressed by Gödel.

Langan's paper can be found here EDIT: Fixed link.

Comment author: Tuukka_Virtaperko 10 January 2012 03:28:52PM 0 points [-]

To clarify, I'm not the generic "skeptic" of philosophical thought experiments. I am not at all doubting the existence of the world outside my head. I am just an apparently competent metaphysician in the sense that I require a Wheeler-style reality theory to actually be a Wheeler-style reality theory with respect to not having arbitrary declarations.

Comment author: Risto_Saarelma 10 January 2012 06:34:42PM 4 points [-]

There might not be many people here who are sufficiently up to speed on philosophical metaphysics to have any idea what, for example, a Wheeler-style reality theory is. My stereotypical notion is that the people at LW have pretty much ignored philosophy that isn't grounded in mathematics, physics or cognitive science from Kant onwards, and won't bother with stuff that doesn't seem readable from this viewpoint. The tricky thing that would help would be to somehow translate the philosopher-speak into lesswronger-speak. Unfortunately, this'd require some fluency in both.

Comment author: Tuukka_Virtaperko 13 January 2012 01:02:05AM 1 point [-]

It's not like your average "competent metaphysicist" would understand Langan either. He might not even understand Wheeler. Langan's undoing is to have the goals of a metaphysicist and the methods of a computer scientist. He is trying to construct a metaphysical theory which structurally resembles a programming language with dynamic type checking, as opposed to static typing. Now, metaphysicists do not tend to construct such theories, and computer scientists do not tend to be very familiar with metaphysics. Metaphysical theories tend to be deterministic instead of recursive, and have a finite preset number of states that an object can have. I find the CTMU paper a bit sketchy and missing important content, besides having the mistake. If you're interested in the mathematical structure of a recursive metaphysical theory, here's one: http://www.moq.fi/?p=242

Formal RP doesn't require metaphysical background knowledge. The point is that because the theory includes a cycle of emergence, represented by the power set function, any state of the cycle can be defined in relation to other states and prior cycles, and the number of possible states is infinite. The power set function will generate a staggering amount of information in just a few cycles, though. Set R is supposed to contain sensory input and thus solve the symbol grounding problem.
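That "staggering amount of information" is easy to quantify: each application of the power set takes a set of size k to one of size 2^k. A quick sketch (my own illustration, not taken from the RP page):

```python
# Iterating the power set: a set of size k has 2**k subsets,
# so cardinalities explode after only a few cycles.
sizes = [1]  # start from a one-element set
for _ in range(4):
    sizes.append(2 ** sizes[-1])

print(sizes)  # [1, 2, 4, 16, 65536]; the next cycle would have 2**65536 elements
```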

Comment author: Tuukka_Virtaperko 13 January 2012 01:25:40PM *  0 points [-]

Of course the symbol grounding problem is rather important, so it doesn't really suffice to say that "set R is supposed to contain sensory input". The metaphysical idea of RP is something to the effect of the following:

Let n be 4.

R contains everything that could be used to ground the meaning of symbols.

  • R1 contains sensory perceptions
  • R2 contains biological needs such as eating and sex, and emotions
  • R3 contains social needs such as friendship and respect
  • R4 contains mental needs such as perceptions of symmetry and beauty (the latter is sometimes reducible to the Golden ratio)

N contains relations of purely abstract symbols.

  • N1 contains the elementary abstract entities, such as symbols and their basic operations in a formal system
  • N2 contains functions of symbols
  • N3 contains functions of functions. In mathematics I suppose this would include topology.
  • N4 contains information about the limits of the system, such as completeness or consistency. This information forms the basis of what "truth" is like.

Let ℘(T) be the power set of T.

Solving the symbol grounding problem requires R and N to be connected. Let us assume that ℘(Rn) ⊆ Rn+1. R5 hasn't been defined, though. If we don't assume subsets of R to emerge from each other, we'll have to construct much more complicated theories that are harder to understand.

This way we can assume there are two ways of connecting R and N. One is to connect them in the same order, and one in the inverse order. The former is set O and the latter is set S.

Set O includes the "realistic" theories, which assume the existence of an "objective reality".

  • ℘(R1) ⊆ O1 includes theories regarding sensory perceptions, such as physics.
  • ℘(R2) ⊆ O2 includes theories regarding biological needs, such as the theory of evolution
  • ℘(R3) ⊆ O3 includes theories regarding social affairs, such as anthropology
  • ℘(R4) ⊆ O4 includes theories regarding rational analysis and judgement of the way in which social affairs are conducted

The relationship between O and N:

  • N1 ⊆ O1 means that physical entities are the elementary entities of the objective portion of the theory of reality. Likewise:
  • N2 ⊆ O2
  • N3 ⊆ O3
  • N4 ⊆ O4

Set S includes the "solipsistic" ideas in which the mind focuses on itself.

  • ℘(R4) ⊆ S1 includes ideas regarding what one believes
  • ℘(R3) ⊆ S2 includes ideas regarding learning, that is, adoption of new beliefs from one's surroundings. Here social matters such as prestige, credibility and persuasiveness affect which beliefs are adopted.
  • ℘(R2) ⊆ S3 includes ideas regarding judgement of ideas. Here, ideas are mostly judged by how they feel. I.e. if a person is revolted by the idea of creationism, they are inclined to reject it even without rational grounds, and if it makes them happy, they are inclined to adopt it.
  • ℘(R1) ⊆ S4 includes ideas regarding the limits of the solipsistic viewpoint. Sensory perceptions of objectively existing physical entities obviously present some kind of a challenge to it.

The relationship between S and N:

  • N4 ⊆ S1 means that beliefs are the elementary entities of the solipsistic portion of the theory of reality. Likewise:
  • N3 ⊆ S2
  • N2 ⊆ S3
  • N1 ⊆ S4

That's the metaphysical portion in a nutshell. I hope someone was interested!
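For anyone who parses code faster than set notation, here is one minimal toy encoding of the stated inclusions between R, N, O and S (the placeholder elements are my own inventions, and the emergence relation ℘(Rn) ⊆ Rn+1 is omitted):

```python
from itertools import chain, combinations

def powerset(s):
    """All subsets of s, as a set of frozensets."""
    s = list(s)
    return {frozenset(c)
            for c in chain.from_iterable(combinations(s, r)
                                         for r in range(len(s) + 1))}

# Toy ground-level sets (hypothetical placeholder elements, not from RP).
R = {1: {"sight", "sound"}, 2: {"hunger"}, 3: {"friendship"}, 4: {"symmetry"}}
N = {1: {"symbol"}, 2: {"function"}, 3: {"function-of-function"}, 4: {"consistency"}}

# O_n contains both ℘(R_n) and N_n, per the stated inclusions.
O = {n: powerset(R[n]) | N[n] for n in range(1, 5)}
# S_n contains ℘(R_{5-n}) and N_{5-n}, i.e. the inverse ordering.
S = {n: powerset(R[5 - n]) | N[5 - n] for n in range(1, 5)}

# The stated relations hold by construction:
assert all(powerset(R[n]) <= O[n] and N[n] <= O[n] for n in range(1, 5))
assert all(powerset(R[5 - n]) <= S[n] and N[5 - n] <= S[n] for n in range(1, 5))
```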

Comment author: gregconen 02 February 2010 10:29:15PM 2 points [-]

But my dilemma is that Chris Langan is the smartest known living man, which makes it really hard for me to shrug the CTMU off as nonsense.

You can't rely too much on intelligence tests, especially in the super-high range. The tester himself admitted that Langan fell outside the design range of the test, so the listed score was an extrapolation. Further, IQ measurements, especially at the extremes and especially on only a single test (and as far as I could tell from the Wikipedia article, he was only tested once), measure test-taking ability as much as general intelligence.

Even if he is the most intelligent man alive, intelligence does not automatically mean that you reach the right answer. All evidence points to it being rubbish.

Comment author: ciphergoth 02 February 2010 10:24:23PM 1 point [-]

Chris Langan is the smartest known living man

Many smart people fool themselves in interesting ways thinking about this sort of thing. And of course, when predicting general intelligence based on IQ, remember to account for regression to the mean: if there's such a thing as the smartest person in the world by some measure of general intelligence, it's very unlikely to be the person with the highest IQ.
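The regression effect is easy to see in a quick simulation (a toy model with made-up parameters, not anything from the thread): when a noisy test picks out the top scorer from a large group, that person is usually not the one with the top underlying ability.

```python
import random

# Toy model (my assumption): true general intelligence ~ N(100, 15);
# a single IQ test adds measurement noise ~ N(0, 10).
random.seed(0)

def top_scorer_is_top_ability(population=200):
    abilities = [random.gauss(100, 15) for _ in range(population)]
    scores = [a + random.gauss(0, 10) for a in abilities]
    return (max(range(population), key=scores.__getitem__)
            == max(range(population), key=abilities.__getitem__))

trials = 1000
hits = sum(top_scorer_is_top_ability() for _ in range(trials))
# The highest scorer is usually NOT the most able person: an extreme
# score is partly extreme luck, which regresses on retest.
print(f"top scorer was also top ability in {hits / trials:.0%} of trials")
```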

Comment author: advael 09 June 2015 05:14:09PM *  0 points [-]

A powerful computer with a bad algorithm or bad information can produce a high volume of bad results that are all internally consistent.

(IQ may not be directly analogous to computing power, but there are a lot of factors that matter more than the author's intelligence when assessing whether a model bears out in reality.)