Okay, so I recently made a joke about a future Wikipedia article about Less Wrong:
[article claiming that LW opposes feelings and supports neoreaction] will probably be used as a "reliable source" by Wikipedia. Explanations that LW didn't actually "urge its members to think like machines and strip away concern for other people's feelings" will be dismissed as "original research", and people who make such arguments will be banned. Less Wrong will be officially known as a website promoting white supremacism, Roko's Basilisk, and removing female characters from computer games. This Wikipedia article will be quoted by all journals, your families will be horrified by what kind of monster you have become, and all LW members will be fired from their jobs.
A few days later I actually looked at the Wikipedia article about Less Wrong:
...In July 2010, LessWrong contributor Roko posted a thought experiment to the site in which an otherwise benevolent future AI system tortures simulations of those who did not work to bring the system into existence. This idea came to be known as "Roko's basilisk," based on Roko's idea that merely hearing about the idea
I'd suggest being careful about your approach. If you lose this battle, you may not get another chance. David Gerard most likely has 100 times more experience with wiki battling than you. Essentially, when you come up with a strategy, sleep on it, and then try imagining how a person already primed against LW would read your words.
For example, expect that any edit made by anyone associated with LW will be (1) traced back to their identity and LW account, and consequently (2) reverted as a conflict of interest. And everyone will be like "ugh, these LW guys are trying to manipulate our website", so the next time they are not going to even listen to any of us.
Currently my best idea -- I haven't taken any steps yet, just thinking -- is to post a reaction to the article's Talk page, without even touching the article. This would have two advantages: (1) No one can accuse me of hiding my partiality, because that's what I would openly disclose first, and because I would plainly say that as a person with a conflict of interest I shouldn't edit the article myself. Kinda establishing myself as the good guy who follows the Wikipedia rules. (2) A change to the article could simply be reverted by David, but he i...
Is any of the following not true?
You are one of the 2 or 3 most vocal critics of LW worldwide, for years, so this is your pet issue, and you are far from impartial.
A lot of what the "reliable sources" write about LW originates from your writing about LW.
You are cherry-picking facts that describe LW in a certain light: For example, you mention that some readers of LW identify as neoreactionaries, but fail to mention that some of them identify as e.g. communists. You keep adding Roko's basilisk as one of the main topics about LW, but remove mentions of e.g. effective altruism, despite the fact that there is at least 100 times more debate on LW about the latter than about the former.
Should we expect more anti-rationalism in the future? I believe that we should, but let me outline what actual observations I think we will make.
Firstly, what do I mean by 'anti-rationality'? I don't mean, in particular, that people will criticize LessWrong. I mean it in the general sense of skepticism towards science and logical reasoning, skepticism towards technology, and hostility to rationalistic methods applied to things like policy, politics, economics, and education.
And there are a few things I think we will observe first (some of...
Front page being reconfigured. For the moment, you can get to a page with the sidebar by going through the "read the sequences" link (not great, and if you can read this, you probably didn't need this message).
Maybe there could be some high-profile positive press for cryonics if it became standard policy to freeze endangered species' seeds or DNA for later resurrection.
Hello guys, I am currently writing my master's thesis on biases in the investment context. One sub-sample that I am studying is people who are educated about biases in a general context, but not in the investment context. I guess LW is the right place to find some of those, so I would be very happy if some of you would participate, since people who are aware of biases are hard to come by elsewhere. Also, I explicitly ask about activity in the LW community in the survey, so if enough LWers participate I could analyse them as an individual subsample. Would...
Not the first criticism of the Singularity, and certainly not the last. I found this on reddit, just curious what the response will be here:
"I am taking up a subject at university, called Information Systems Management, and my teacher is a Futurologist! He refrains from even teaching the subject just to talk about technology and how it will solve all of our problems and make us uber-humans in just a decade or two. He has a PhD in A.I. and has already talked to us about nanotechnology getting rid of all diseases, A.I. merging with us, smart cities that...
I think most people on LW also distrust blind techno-optimism, hence the emphasis on existential risks, friendliness, etc.
I've been writing about effective altruism and AI and would be interested in feedback: Effective altruists should work towards human-level AI
What do you think of the idea of 'learning all the major mental models', as promoted by Charlie Munger and Farnam Street? These mental models also include cognitive fallacies, one of the major foci of LessWrong.
I personally think it is a good idea, but it doesn't hurt to check.
The main page lesswrong.com no longer has a link to the Discussion section of the forum, nor a login link. I think these changes are both mistakes.
Suppose there are 100 genes which figure into intelligence, the odds of getting any one being 50%.
The most common result would be for someone to get 50/100 of these genes and have average intelligence.
Some smaller number would get 51 or 49, and a smaller number still would get 52 or 48.
And so on, until at the extremes of the scale, so few people get 0 or 100 of them that, in all likelihood, no one ever born has had all 100.
As such, incredible superhuman intelligence would be manifest in a human who just got lucky enough to have all 100 genes. If some or all of these genes could be identified and manipulated in the genetic code, we'd have unprecedented geniuses.
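The thought experiment above is just a binomial distribution, so the numbers can be checked directly. A minimal sketch (the 100-gene count and the 50% odds are the post's hypothetical assumptions, not real genetics):

```python
from math import comb

N = 100  # hypothetical number of additive intelligence genes (from the post)
p = 0.5  # assumed 50% chance of inheriting any one gene

def prob_of_k(k: int) -> float:
    """Probability of inheriting exactly k of the N genes (binomial pmf)."""
    return comb(N, k) * p**k * (1 - p)**(N - k)

print(prob_of_k(50))   # the modal outcome: roughly 8% of people
print(prob_of_k(100))  # all 100 genes: 2**-100, about 8e-31
```

Since 2^-100 is around 8 x 10^-31, and far fewer than 10^31 humans have ever lived, the "no one has ever had all 100" conclusion does follow from these assumptions.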
I think a lot more can be said about this, but maybe that's best left to a full post, I'm not sure. Let me know if this was too long / short or poorly worded.
Writing style looks fine. My quibbles would be with the empirical claims/predictions/speculations.
Is the elite really more of a cognitive elite than in the past?
Strenze's 2007 meta-analysis (previously) analyzed how the correlations between IQ and education, IQ and occupational level, and IQ and income changed over time. The first two correlations decreased and the third held level at a modest 0.2.
Will elite worldviews increasingly diverge from the worldviews of those left behind economically?
Maybe, although just as there are forces for divergence, there are forces for convergence. The media can, and do, transmit elite-aligned worldviews just as they transmit elite-opposed worldviews, while elites fund political activity, and even the occasional political movement.
Would increasing inequality really prevent people from noticing economic gains for the poorest?
That notion sounds like hyperbole to me. The media and people's social networks are large, and can discuss many economic issues at once. Even people who spend a good chunk of time discussing inequality discuss gains (or losses) of those with low income or wealth.
For instance, Branko Milanović, whose standing in economics comes from his studies of inequality, is probably best known for his elephant chart, which presents income gains across the global income distribution, down to the 5th percentile. (Which percentile, incidentally, did not see an increase in real income between 1988 and 2008, according to the chart.)
Also, while the Anglosphere discussed inequality a great deal in the 2010s, that seems to me a vogue produced by the one-two-three punch of the Great Recession, the Occupy movement, and the economist feeding frenzy around Thomas Piketty's book. Before then, I reckon most of the non-economists who drew special attention to economic inequality were left-leaning activists and pundits. That could become the norm once again, and if so, concerns about poverty would likely become more salient to normal people than concerns about inequality.
Will the left continue adopting lots of ideas from postmodernism?
This is going to depend on how we define postmodernism, which is a vexed enough question that I won't dive deeply into it (at least TheAncientGeek and bogus have taken it up). If we just define (however dodgily) postmodernism to be a synonym for anti-rationalism, I'm not sure the left (in the Anglosphere, since that's the place we're presumably really talking about) is discernibly more postmodernist/anti-rationalist than it was during the campus/culture wars of the 1980s/1990s. People tend to point to specific incidents when they talk about this question, rather than try to systematically estimate change over time.
Granted, even if the left isn't adopting any new postmodern/anti-rationalist ideas, the ideas already bouncing around in that political wing might percolate further out and trigger a reaction against rationalism. Compounding the risk of such a reaction is the fact that the right wing can also operate as a conduit for those ideas — look at yer Alex Jones and Jason Reza Jorjani types.
Is politics becoming more a war of worldviews than arguments for & against various beliefs?
Maybe, but evidence is needed to answer the question. (And the dichotomy isn't a hard and fast one; wars of worldviews are, at least in part, made up of skirmishes where arguments are lobbed at specific beliefs.)
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should start on Monday, and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "