Suppose I think, after doing my accounts, that I have a large balance at the bank. And suppose you want to find out whether this belief of mine is "wishful thinking." You can never come to any conclusion by examining my psychological condition. Your only chance of finding out is to sit down and work through the sum yourself.
-- C. S. Lewis
The market isn't particularly efficient. For example, if you bought "No" on all the presidential candidates to win, it would cost $16.16, but would be worth at least $17 for a 5% gain. Of course, after paying the 10% fee on profits and 5% withdrawal fee you would be left with a loss, which is why this opportunity still exists.
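To spell out the arithmetic (a quick sketch, assuming the 10% fee is charged only on the $0.84 of profit and the 5% fee on the full withdrawn balance; the exact fee mechanics may differ):

```python
cost = 16.16                       # total cost of buying "No" on every candidate
payout = 17.00                     # at least 17 "No" contracts pay $1 each
profit = payout - cost             # $0.84 gross profit, roughly a 5% gain

after_profit_fee = payout - 0.10 * profit   # 10% fee on profits
net = after_profit_fee * (1 - 0.05)         # 5% withdrawal fee

print(round(net - cost, 2))        # -0.09: the "arbitrage" nets a small loss
```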
Does this affect the accuracy of the market? Serious question; I do not understand the nitty-gritty economics very well.
Just as a little bit of a counterpoint, I loved the 2006-2010 ebook and was never particularly bothered by the length. I read the whole thing at least twice through, I think, and have occasionally used it to look up posts and so on. The format just worked really well for me. This may be because I am an unusually fast reader, or because I was young and had nothing else to do. But it certainly isn't totally useless :P
Working from memory, I believe that when asked about AI in the story, Eliezer said "they say a crackpot is someone who won't change his mind and won't change the subject -- I endeavor to at least change the subject." Obviously this is non-binding, but it still seems odd to me that he would go ahead and do the whole thing that he did with the mirror.
In my opinion the gamma function is by far the stupidest. IME, the off-by-one literally never makes equations clearer; it only obfuscates the relationship between continuous and discrete things (etc.) by adding an annoying extra step that trips up your intuition. Seems like a simple coordination failure.
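For reference, the off-by-one in question:

$$\Gamma(z) = \int_0^\infty t^{z-1} e^{-t}\,dt, \qquad \Gamma(n) = (n-1)! \ \text{ for integers } n \ge 1,$$

so you have to remember the shift $n! = \Gamma(n+1)$ every time you translate between the factorial and its continuous interpolation.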
Yayy! I was having a shitty day, and seeing these results posted lifted my spirits. Thank you for that! Below are my assorted thoughts:
I'm a little disappointed that the correlation between height and P(supernatural)-and-similar didn't hold up this year, because it was really fun trying to come up with explanations for it that weren't prima facie moronic. Maybe that should have been a sign it wasn't a real thing.
The digit ratio thing is indeed delicious. I love that stuff. I'm surprised there wasn't a correlation to sexual orientation, though, since I se...
I remember answering the computer games question and at first feeling like I knew the answer. Then I realized the feeling I was having was that I had a better shot at the question than the average person that I knew, not that I knew the answer with high confidence. Once I mentally counted up all the games that I thought might be it, then considered all the games I probably hadn't even thought of (of which Minecraft was one), I realized I had no idea what the right answer was and put something like 5% confidence in The Sims 3 (which at least is a top ten game). But the point is that I think I almost didn't catch my mistake before it was too late, and this kind of error may be common.
I was confident in my incorrect computer game answer because I had recently read the Wikipedia page List of best-selling video games, remembered the answer, and unthinkingly assumed that "video games" was the same as "computer games".
In the Bayesian view, you can never really make absolute positive statements about truth anyway. Without a simplicity prior you would need some other kind of distribution. Even for computable theories, I don't think you can ever have a uniform distribution over possible explanations (math people, feel free to correct me on this if I'm wrong!); you could have some kind of perverse non-uniform but non-simplicity-based distribution, I suppose, but I would bet some money that it would perform very badly.
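(For the record, the standard argument: if countably many hypotheses $H_1, H_2, \dots$ each get the same probability $p$, then

$$\sum_{i=1}^{\infty} P(H_i) = \sum_{i=1}^{\infty} p = \begin{cases} 0 & \text{if } p = 0 \\ \infty & \text{if } p > 0, \end{cases}$$

so the total can never equal 1, and no uniform distribution over that space exists.)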
I haven't looked into it much myself, but a couple of people have mentioned RibbonFarm as being something like that.
In terms of Death Note, I've read the first several volumes and can vouch that it's a fun, "cerebral" mystery/thriller, especially if you like people being ludicrously competent at each other, having conversations with multiple levels of hidden meaning, etc. Can't say there's anything super rational about it, but the aesthetic is certainly there.
I occasionally remember to keep pencil + paper by my bed for this reason, so that I can write such things down in the dark without having to get up or turn on a light. Even if the results aren't legible in the usual sense, I've almost always been able to remember what they were about in the morning.
Eliezer is kind of a massive dork who also has an unabashedly high opinion of himself and his ideas. So people see him as a low-status person acting as if he is high-status, which is a pattern that for whatever reason inspires hatred in people. LessWrong people don't feel this way, because to us he is a high-status person acting as if he is high-status, which is perfectly fine.
Also, one thing that I think works against him is how defensive he gets when facing criticism. On Reddit, he will occasionally write long rants about how he is an unfair tar...
If many people dismiss LW and MIRI and CFAR for similar reasons, then the only rational response is to identify how that "this is ridiculous" response can be prevented.
I agree with your overall point, but I think that "this is ridiculous" is not really the author's main objection to the LW-sphere; it's clearer in the context of the whole piece, but they're essentially setting up LW/MIRI/CFAR as typical of Silicon Valley culture(!), a collection of mad visionaries (in a good way) whose main problem is elitism; ethereum is then present...
I have it in hard copy, but all attempts so far to scan or photograph it have been foiled. I'm working on it, though; it's by far the best media piece on Less Wrong I've seen so far.
ETA - To give you an idea: the author personally attended a CFAR workshop and visited MIRI, and upon closer inspection one can make out part of the Map of Bay Area Memespace in one of the otherwise-trite collage illustrations.
Oh, I think we're using the phrase "political movement" in different senses. I meant something more like "group of people who define themselves as a group in terms of a relatively stable platform of shared political beliefs, which are sufficiently different from the political beliefs of any other group or movement". Other examples might be libertarianism, anarcho-primitivism, internet social justice, etc.
I guess this is a non-standard usage, so I'm open to recommendations for a better term.
A reasonable case could be made that this is how NRx came to be.
If this is where NRx came from, then I am strongly reminded of the story of the dog that evolved into a bacterium. An alternative LW-like community that evolved into an aggressive political movement? Either everyone involved was an advanced hyper-genius or something went terribly wrong somewhere along the way. That's not to say that something valuable did not result, but "mission drift" would be a very mild phrase.
Motte-and-Bailey effect, when instead of happening inside one person's head it happens to a movement: different people from the same movement occupy the motte and the bailey. (I think that individual and group motte-and-baileys are quite distinct.)
This could just as easily be described, with the opposite connotation, as the movement containing some weakmans*, which makes me think that we need a better way of talking about this phenomenon. 'Palatability spread' or 'presentability spread'? But that isn't quite right. A hybrid term like 'mottemans' and 'baile...
(That said, Richard Feynman is dead and therefore cannot sexually harass any of his current readers.)
A similar argument could be made that a pre-recorded lecture cannot sexually harass someone either (barring of course very creative uses of the video lecture format which we probably would have heard about by now :P ).
Average article quality is almost certainly going down, but the main driving force is probably mass-creation of stub articles about villages in Eastern Europe, plant genera, etc. Of course, editors are probably spread more thinly even among important topics as well. A lot of people seem to place the blame for any and all of Wikipedia's problems on bureaucracy, but as a regular editor I find such criticisms often seem foreign, like they're talking about a totally different website. True, there's a lot of formalities, but they're mostly invisible, and a reasonably ...
This is a good point. I've gotten past my spiral around Eliezer and am working on crawling out of a similar whirlpool around Yvain, and I think that Eliezer's egotistical style, even if it is somewhat justified, plays a big part in sending people down that spiral around him. Seeing him being sort of punctured might be useful, even though I'm sure it's awful for him personally.
I mean, you could run correlations with Openness to experience or with age, right? I guess the sample size is probably too small for a lot of interesting analysis, but I'm sure one could do some.
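A minimal sketch of what I mean, assuming the survey export were a CSV; the file name and column names here are hypothetical, and the real export surely labels things differently:

```python
import pandas as pd

# "survey.csv", "Openness", and "Age" are placeholder names.
df = pd.read_csv("survey.csv")

# All pairwise Pearson correlations among the numeric columns,
# e.g. Openness and Age against whatever else was measured.
print(df.corr(numeric_only=True))
```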