Comment author: dyokomizo 09 June 2015 11:56:54AM 4 points [-]

I'm going.

Comment author: dyokomizo 18 April 2012 12:48:11AM 0 points [-]

I'm going again, it was too fun/interesting to miss.

Comment author: dyokomizo 28 February 2012 01:29:24PM 0 points [-]

Count me in.

In response to comment by dyokomizo on Where are we?
Comment author: Gust 21 December 2011 08:47:04AM 0 points [-]

Same! Are you still around?

In response to comment by Gust on Where are we?
Comment author: dyokomizo 02 January 2012 01:00:06AM 0 points [-]

Around São Paulo, yes. Around LW, not much anymore, I mostly read it via feed reader.

Comment author: AdeleneDawner 03 October 2010 10:53:50PM 5 points [-]

How about a prediction that a particular human will eat bacon instead of jalapeno peppers? (I'm particularly thinking of myself, for whom that's true, and a vegetarian friend, for whom the opposite is true.)

Comment author: dyokomizo 04 October 2010 12:46:01AM -2 points [-]

This model seems to be reducible to "people will eat what they prefer".

A good model should reduce the number of bits needed to describe a behavior; if the model requires keeping a log (e.g., of what particular humans prefer to eat) in order to predict something, it's not much less complex (in bit-encoding terms) than the behavior itself.
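The bit-counting point can be sketched concretely: a "model" that is just a lookup table of each person's preference costs about as many bits as the raw behavior log it predicts, while a genuinely compressive model states a rule plus exceptions. A minimal illustration (the names, the one-bit encoding, and the exception scheme are all invented for the example):

```python
import math

# Hypothetical data: each person's observed choice, one bit each (bacon vs peppers).
choices = {"alice": "bacon", "bob": "peppers", "carol": "bacon", "dave": "bacon"}

# Cost of the raw behavior log: one bit per person.
raw_bits = len(choices) * 1

# A "model" that is just a lookup table of who prefers what must itself
# store one bit per person, so it saves nothing over the raw log.
lookup_model_bits = len(choices) * 1

# A compressive model states a rule plus exceptions: "everyone eats bacon,
# except these people". Cost: enough bits to name each exception.
exceptions = [p for p, c in choices.items() if c != "bacon"]
rule_model_bits = len(exceptions) * math.ceil(math.log2(len(choices)))

print(raw_bits, lookup_model_bits, rule_model_bits)  # 4 4 2
```

The lookup-table "model" is exactly as long as the data; only the rule-plus-exceptions model actually compresses, which is the sense in which "people will eat what they prefer" plus a preference log isn't a real model.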

Comment author: Douglas_Knight 04 October 2010 12:37:16AM *  3 points [-]

I think "vague" is a poor word choice for that concept. "(not) informative" is a technical term with this meaning. There are probably words which are clearer to the layman.

Comment author: dyokomizo 04 October 2010 12:41:50AM 1 point [-]

I agree "vague" is not a good word choice. "Irrelevant" (using relevance as it's used to describe search results) is a better word.

Comment author: [deleted] 03 October 2010 07:34:17PM 5 points [-]

How detailed a model are you thinking of? It seems like there are at least some easy and somewhat trivial predictions we could make, e.g. that a human will eat chocolate instead of motor oil.

In response to comment by [deleted] on The Irrationality Game
Comment author: dyokomizo 03 October 2010 07:47:20PM 3 points [-]

I would classify such predictions as vague; after all, they match equally well for every human being under almost any conditions.

Comment author: dyokomizo 03 October 2010 01:44:46PM 45 points [-]

There's no way to create a non-vague, predictive model of human behavior, because most human behavior is (mostly) random reaction to stimuli.

Corollary 1: most models explain after the fact, and require both that the subject be aware of the model's predictions and that the predictions be vague and underspecified enough to make astrology seem like spacecraft engineering.

Corollary 2: we'll spend most of our time in drama, trying to understand the real reasons or the truth about our own and others' behavior, even when presented with evidence pointing to the randomness of our actions. After the fact we'll fabricate an elaborate theory to explain everything, including the evidence, but this theory will have no predictive power.

Comment author: taw 08 June 2010 06:09:55PM 4 points [-]

I think my comment was rather vague, and people aren't sure what I meant.

These are all just my impressions; as far as I can tell, the evidence for all of this is rather underwhelming. I'm writing this more to explain my thinking than to "prove" anything.

It seems to me that people come in different levels of smartness. There are some people with all sorts of problems that make them incapable of even reaching the normal human level, but let's ignore them entirely here.

Then there are normal people, who are pretty much incapable of original highly insightful thought, critical thinking, rationality, etc. They can usually do OK in normal life, and can even be quite capable in their narrow area of expertise, but that's about it. They often make the most basic logic mistakes.

Then there are "smart" people, who are capable of original insight and don't get too stupid too often. IQ tests aren't measuring exactly the same thing, but they are capable of distinguishing these people from normal people reasonably well. With smart people, both their top performance and their average performance are a lot better than with average people. In spite of that, all of them very often fail at basic rationality in particular domains they feel too strongly about.

Now I'm conflicted about whether people who are as far above "smart" as "smart" is above normal really exist. A canonical example of such a person would be Feynman: from my limited information he seems just ridiculously smart. Eliezer seems to believe Einstein is like that, but I have even less information about him. You can probably think of a few other such people.

Unfortunately there's a second observation: there's no reason to believe such people existed only in the past, or that they would have an aversion to blogging. So if super-smart people exist, it's fairly certain that some blogs by such people exist. And if such blogs existed, I would expect to have found a few by now.

And yet, every time it seemed to me that someone might just be that smart and I started reading their blog, it turned out very quickly that my estimate of their smartness suffered rapid regression to the mean. All my super-smart candidates managed to say such horrible things, and to be deaf to such obvious arguments, that I doubt any of them really qualifies.

So here's an alternative theory: no human alive is much smarter than the "normally smart". Out of the population of normally smart people, thanks to domain expertise, wit and writing skill, compatibility with my beliefs (or at least happening to avoid my red flags), higher productivity, luck, etc., some people simply seem much smarter than the rest.

I'm not trolling here, but consider Eliezer; I've picked this example because it's well known here. For some time he was exactly such a candidate, however:

  • he is ridiculously good at writing - just look at his fanfics - biasing my perception
  • he manages to avoid many of my red flags, biasing my perception
  • he has a cultural background pretty similar to mine, biasing my perception
  • his writing style is very good at avoiding unwarranted certainty. This might seem more rational, but it's really more of a style issue: people like Eliezer and Tyler Cowen who write cautiously just seem far smarter to me than people like Robin Hanson who write in a "no disclaimer" style, even though I know very well that Robin is fully aware that the contrarian theories he proposes are usually wrong, and that there are usually other factors in addition to the one he happens to be writing about at the moment (and he says so every time he's asked). Style differences bias my perception again.
  • Eliezer usually manages to avoid writing about things I know more about than he does, so he usually has the advantage of expertise, biasing my perception.
  • So it's safe to guess that however smart Eliezer is, I'm overestimating him: nearly all the biases point in the same direction.
  • On the other hand, he sometimes makes ridiculously wrong statements, like his calculation of the cost of cryonics, which was blatantly an order of magnitude off. I still don't know whether this was a massive brain failure (this and other such failures disqualifying him as a super-smart candidate) or a conscious attempt at dark arts (in which case he might still qualify, but he loses points for other reasons).

On the other hand, and this provides some counter-evidence to my theory, let's look at myself. I publish anything on my blog, and in comments everywhere, that seems to have expected public value higher than zero, and very often I'm in a hurry or sleep-deprived, or otherwise far below my top performance. I exaggerate to get the point across very often. I write outside my area of expertise a lot, not uncommonly making severe mistakes. I'm not that good at writing (not to mention that English is not my first language), so things I say may be very unclear.

Unfortunately, a normally smart person with my behaviour patterns and a super-smart person with my behaviour patterns would probably both fail my super-smartness test.

As you can see, I'm not even terribly convinced that my "super-smart people don't exist" theory is true. I would love to see if other people have good evidence or insight one way or the other.

Another by-the-way: very often a blatantly wrong belief might still be the least-wrong belief given someone's web of beliefs. Often it's easier to believe some minor wrong thing than to rebuild your whole belief system, risking far more damage, just to make something small come out correct. So perhaps even my test for being really, really wrong is not all that useful.

In response to comment by taw on Open Thread: June 2010
Comment author: dyokomizo 11 June 2010 10:39:37AM 3 points [-]

It doesn't seem to me that you have an accurate description of what a super-smart person would do or say, other than matching your beliefs and providing insightful thoughts. For example, do you expect super-smart people to be proficient in most areas of knowledge, or even able to quickly grasp the foundations of different areas through super-abstraction? Would you expect them to be mostly unbiased? Your definition needs to be more objective and predictive, instead of descriptive.

Comment author: taw 03 June 2010 04:56:46AM 7 points [-]

I have a theory: super-smart people don't exist; it's all due to selection bias.

It's easy to think someone is extremely smart if you've only seen a sample of their most insightful thinking. But every time that happened to me, and I found that such a promising person had a blog or something like that, it universally took very little time to find something terribly brain-hurtful they'd written there.

So the null hypothesis is: there's a large population of fairly-smart-but-nothing-special people, who think and publish their thoughts a lot. Because the best thoughts get distributed, and average and worse thoughts don't, it's very easy, from such small biased samples, to believe some of them are far smarter than the rest, even though their averages are pretty much the same.

(feel free to replace "smart" by "rational", the result is identical)
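The selection effect being claimed here can be checked numerically: draw many thinkers from the same quality distribution, let only each one's best thoughts get distributed, and the published samples look far above the (identical) underlying means. A toy simulation, with all population sizes and parameters invented for illustration:

```python
import random

random.seed(0)

N_PEOPLE = 1000
THOUGHTS_EACH = 200
PUBLISHED = 5  # only each person's top thoughts get widely distributed

# Everyone's thought quality is drawn from the same distribution (mean 0):
# no one is "super-smart" by construction.
true_means = []
best_published_means = []
for _ in range(N_PEOPLE):
    thoughts = [random.gauss(0, 1) for _ in range(THOUGHTS_EACH)]
    true_means.append(sum(thoughts) / THOUGHTS_EACH)
    top = sorted(thoughts, reverse=True)[:PUBLISHED]
    best_published_means.append(sum(top) / PUBLISHED)

avg_true = sum(true_means) / N_PEOPLE
avg_published = sum(best_published_means) / N_PEOPLE

# The underlying means cluster near 0, but judging people by their
# distributed-best output makes everyone look a couple of standard
# deviations above average: the apparent stars are a sampling artifact.
print(round(avg_true, 2), round(avg_published, 2))
```

This only shows the bias is real and large; it doesn't by itself rule out genuine outliers, which is exactly the ambiguity the comment points at.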

In response to comment by taw on Open Thread: June 2010
Comment author: dyokomizo 07 June 2010 12:59:33AM 2 points [-]

How would you describe the writing patterns of super-smart people? Similarly, what would meeting, talking with, or debating them feel like?
