"Spreading quicker" may not be the best question to ask. The question I'm more interested in is, What is the relationship between speed of communication, and the curve that describes innovation over time?
A good model for this is the degree of genetic isolation in a genetic algorithm. Compare two settings for a GA. One allows mating between any two organisms in the population. Another has many subpopulations, and allows genetic exchange between subpopulations less frequently.
Plot the fitness of the most-fit organism in each population by generation. The first GA, which has fast genetic communication, will initially outstrip the second, but it will plateau at a lower level of fitness, and all the organisms in the population will be identical, and evolution will stop. This is called premature convergence.
The second GA, with restricted genetic communication, will catch up and pass the fitness of the first GA, usually continuing on to a much higher optimum, because it maintains homogeneous subpopulations (which allows adaptation) but a diverse global population (which prevents premature convergence).
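The contrast can be sketched in a few lines of Python. Everything here (the fitness landscape, mutation size, migration interval) is an invented toy, not anything from the comment above; it only sets up the panmictic-vs-islands comparison, and on any given seed either variant can win:

```python
import math
import random

def fitness(x):
    # Rugged landscape on [0, 1] with several local optima.
    return math.sin(10 * x) + 0.5 * math.sin(23 * x)

def step(pop, sigma=0.03):
    # Truncation selection: keep the fitter half, refill with mutants.
    pop.sort(key=fitness, reverse=True)
    half = len(pop) // 2
    pop[half:] = [min(1.0, max(0.0, p + random.gauss(0, sigma)))
                  for p in pop[:half]]

def evolve(islands, generations, migrate_every):
    for gen in range(1, generations + 1):
        for pop in islands:
            step(pop)
        if migrate_every and gen % migrate_every == 0 and len(islands) > 1:
            # Rare migration: copy one island's best into another island.
            src, dst = random.sample(range(len(islands)), 2)
            islands[dst][-1] = islands[src][0]
    return max(fitness(x) for pop in islands for x in pop)

random.seed(1)
panmictic = [[random.random() for _ in range(40)]]            # one big pool
isolated = [[random.random() for _ in range(10)] for _ in range(4)]
best_fast = evolve(panmictic, 100, 0)    # unrestricted mating, no islands
best_slow = evolve(isolated, 100, 20)    # four islands, rare migration
print(best_fast, best_slow)
```

In a longer run on a more rugged landscape, plotting the best fitness per generation for each variant is what produces the two curves described above: fast-then-plateau for the single pool, slower but eventually higher for the islands.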
Think about the development of pop music. As communication technology improved, pop stars like Elvis could be heard, seen, and marketed across the entire country more efficiently than local musicians could, and recorded music replaced live performers. On one hand, you could live in Peoria and listen to the most popular musicians in the country. On the other, by 1990, American pop music had nearly stopped evolving. Rebecca Black could become popular across the nation in a single week, but the amount of innovation or quality she produced was negligible.
Basically, rapid communication gives people too much choice. They choose things comfortably similar to what they know. Isolation is needed to allow new things to gain an audience before they're stomped out by the dominant things.
You need to state your preferences as a function of the long-term trajectory of the entropy of ideas, rather than as any instantaneous quantity.
Great comment. Thanks!
Basically, rapid communication gives people too much choice. They choose things comfortably similar to what they know. Isolation is needed to allow new things to gain an audience before they're stomped out by the dominant things.
This is an interesting idea, reminiscent of, e.g. Lakatos's view of the philosophy of science. He argued that we shouldn't let new theories be discarded too quickly, just because they seem to have some things going against them. Only if their main tenets prove to be unfeasible should we discard them.
I think premature convergence does occur regarding the spread of ideas (memes), too (though it obviously varies). I do think, for instance, that what you describe in music has to a certain extent happened in analytic philosophy. In the early 20th century, several "scientific" approaches to philosophy developed in, e.g., Cambridge, Vienna and Uppsala. Today, the higher pace of communication leads to more convergence.
Basically, rapid communication gives people too much choice. They choose things comfortably similar to what they know. Isolation is needed to allow new things to gain an audience before they're stomped out by the dominant things.
There's a parallel view of innovation here that argues for a positive linear relationship between speed of communication and innovation. I first saw it made by Simon Wardley, and I'll try to summarize it below.
The speed at which something becomes commoditized is a function of the speed of communication (Wardley uses data from the agricultural/industrial/information revolutions to illustrate this).
New things happen because they're built on the backs of things which have previously been commoditized; this is what we're talking about when we say it's an idea/product/sound whose "time has come".
Therefore, the speed of new things increases with the speed of communication.
It's possible that both views are true, in that there will be less breadth of innovation, but that narrower band will happen faster with a faster speed of communication.
The dynamics may be very different for ideas that primarily have utility, versus ones (like music) that are aesthetic.
What about non-elite groups? (...) they are likely to be heavily influenced by the cognitive elite, especially in the longer run.
I think they are likely to be influenced by whom they consider high-status. If you succeed in making a conspiracy theory website seem like an authoritative source (it must seem as professional as the mainstream media, except that it brings "news you will not hear elsewhere"), that's all you need.
Why would anyone make a professional-looking conspiracy theory website? Aren't "professionalism" and "crackpot thinking" kinda opposed in real life? Yeah, genuine crackpots usually also have low regard for mainstream design or marketing. But a professional-looking conspiracy theory website is a great vehicle for political propaganda. So a foreign government may spend a lot of money and professional work to make a high-status conspiracy website, where the conspiracies are filtered, and only those that are neutral or convenient for the owner are published.
This is not a hypothetical scenario. Russians already have such a website, with an online radio station, in my country. I see advertisements for it almost everywhere. The content is more or less: "Things your government wants to keep secret from you: Vaccination causes autism. The West is the source of all evil. Russia is a paradise. Being a member of the European Union is bad for you, because they will kidnap and abuse your children!" And of course, the website is "independent and alternative".
I agree with all of this. The upshot seems to be that it's important that those who actually have good ideas achieve high status.
it's important that those who actually have good ideas achieve high status.
If we could solve this, the other problems would be solved soon by people with high status. :D
Now members of the cognitive elite are, or so I claim, reasonably good at distinguishing between good and bad ideas.
Not if they're mind-killed by politics, and in a democracy, almost everything is related to politics in one way or another.
A word of warning: Conspiracy-theorists tend to think of themselves as members of "the cognitive elite" — the unusually well-informed, those who think clearly and skeptically, those who have broken free of brainwashing or consensus.
Fun fact: If I want to hear the newest conspiracy theories, the best place to find them is local Mensa.
Yeah, for me the conclusion is "Mensa is not a place for intelligent debate". But still, the people there are filtered by passing an independent IQ test, so they would not be wrong to believe they are the cognitive elite, for some definition thereof.
Or, to put it even more succinctly, intelligence does not equate to rationality. That's something that's been hammered home time and time again.
Bad ideas exist in the internet but we shouldn't downplay the number of bad ideas held by non-elite pre-internet folks. Counterpoints to these bad ideas also get propagated more. Seems like an improvement overall, to my intuition.
They will quickly pass this, mostly true information, on to other members of the cognitive elite. This means that the higher pace of information dissemination will translate into a higher pace of learning true ideas, for this group.
But spreading information is not the only reason why people communicate with each other and make statements. In some cases it isn't even the most important one.
I used to follow a local politician on Facebook and read his blog. A few years ago, before he was elected, he used to post interesting ideas. Over the next few years he became more popular, and eventually he was elected to the city council. Nowadays his Facebook feed consists mostly of photos of him participating in various events, meeting local business owners, visiting municipal utility companies, and so on. The whole feed is photos of people smiling, and he basically never posts his ideas anymore, unless they are obvious applause lights. Oh, and he stopped blogging. I don't blame him, because his political persona is based on being liked by everyone and not being polarizing.
Of course, this is an extreme case, but many people have their coworkers and bosses on their Facebook. And being polarizing is rarely the best way to advance your career. It seems that only young people care about spreading ideas (good or bad) on their Facebook feeds; older people usually prioritize different things when posting. When you are young, it is easy to find new friends, so it is not a big deal if some people don't like what you post. When you are older, finding new friends is much more difficult, and keeping the old ones becomes much more important. Maybe that's why, in my experience, older people tend to post different things than young people do.
I think it does among the cognitive elite, and that this explains the rise of complex but good ideas such as applied rationality and Effective altruism. I'm less sure about other groups.
If I look at our LessWrong Dojo in Berlin and its use of CFAR techniques, the internet does play a role. However, the fact that a few people attended CFAR workshops also played a key role. I think I would know significantly less about CFAR if I hadn't met Valentine in person and chatted with him.
From my perspective, the German Pirate Party started off well but never actually went deep and did the required thinking to provide a real political alternative. Online conversations on Twitter fractured them further. There was a lot more development of new political ideas in '68 than today.
When it comes to newspapers, reputation becomes less important, while writing sensational articles that get clicks becomes more important. The factual quality of articles might suffer through that mechanism.
But there is room for creating the future in which we want to live. We can actually create better systems.
It certainly leads to ideas that people in specific groups like spreading quicker. Some people like discussions of rationality, and ideas related to rationality will spread quicker among those individuals. It does not necessarily mean those ideas about rationality are good ones. There needs to be some form of feedback built into the system. Each individual seems to be able to choose their own feedback as well, so some people's ability to navigate to these good ideas will be quite good. Other people will be quite good at navigating to bad ideas.
http://blogs.cornell.edu/info2040/2011/11/17/information-cascades-in-social-media-networking/
I think for non-elites it's about the same. It depends on how you conceive "ideas" of course - whether you restrict the term purely to abstractions, or broaden it to include all sorts of algorithms, including the practical.
Non-elites aren't concerned with abstractions as much as elites, they're much more concerned with practical day-to-day matters like raising a family, work, friends, entertainment, etc.
Take for instance DIY videos on Youtube - there are tons of them nowadays, and that's an example of the kind of thing that non-elites (and indeed elites to the extent that they might actually care about DIY) are going to benefit from tremendously. And I think it's going to be natural for a non-elite individual to check out a few (after all it's pretty costless, except in terms of a tiny bit of time) and sift out what seem like the best methods.
I like the idea of verified experts and fact-checking, though in practice I have a much smaller expectation of the mass public obtaining high quality experts.
I feel like my post is going to be way too negative and I'm sick right now with my cognition mildly compromised, but I like the idea of effectively spreading good ideas so I'll make it anyway.
It also makes it much easier for people with shared interests to get in contact.
This is red flag #1 for me. Even with the ability to use the internet to connect to massive networks of people who have different ideas than us, people still seem to have the habit of connecting with people who are already like-minded or close to like-minded. If your main social networking is Facebook and a list of subreddits that don't annoy you, then you're less likely to be exposed to ideas that you would both consider different from your current mindset and worth considering. If everyone is spending time obtaining new ideas from their family, real-life friends (who have been selected for similar preferences), religious group, news channel that shares the same political bent as them, online friends (from similar interests), and possibly worst of all a dictionary that shares the same political bent as them, then they aren't going to be getting quality new ideas.
Ways to break this cycle could include trojan-horsing people into it, creating multiple presentations/versions of an idea and then quietly exposing people in separate groups to them in ways that are most effective at gaining that one group's following, figuring out ways of presenting information that skips over worldviews (I'm assuming this is hard), or some other method someone cleverer than me on here can think of. If you want to do something like get everyone to read the Sequences then you could create a handful of versions that covertly make bad but initially believable attempts to convince them that their current worldview is correct. (I may go to rationalist hell just for saying that and please no one attempt it without reading comments to this making it clear what a stupid idea it is.)
This will of course lead to a tremendous increase in the amount of false or useless information. But it will also lead to an increase in true and relevant information.
People still have limited time and limited resources for obtaining information. Figuring out ways to decrease the false and useless information obtained, while increasing the true and relevant information, would be handy.
We also run into the problem that experts are wrong on occasion and our best attempts at true and relevant information fail. Correcting widely spread incorrect information should have a higher priority than a tiny blurb in the hard-to-see corner of a newspaper. Are there any blogs for "Things You Were Told By Experts Not Very Long Ago That We Are Now Pretty Damn Sure Are Not Correct"? If so, please tell me.
Now members of the cognitive elite are, or so I claim, reasonably good at distinguishing between good and bad ideas.
They do? I thought we just pushed ideas that match up to our worldviews, are novel, and we think other people will enjoy/give us status for.
On the other hand, [non-elite groups] are likely to be heavily influenced by the cognitive elite, especially in the longer run.
In the long run, yeah I can see that. People are influenced by new political ideologies and you have to be pretty smart to think of a brand new political ideology (to think of a single example). However, I'm not sure the pathways that ideas follow from the cognitive elite to the cognitive lacking are pathways that select for the ideas that you might want to see spread. Ideas that spread are selected for Divine Plaguespreadingness (I want to use the term memetic virility here but I don't know enough about memetics yet and don't want to explain it away with a fancy term).
Finding ways to hack into the normal Divine Plaguespreadingness could be useful here or finding ways to skip the normal pathways that ideas follow for moving from higher IQ to lower IQ populations. This is also true of different populations with similar IQ. Rationality is likely nowhere near hitting critical mass for the high IQ population in the English-speaking world. If we can hit a larger percentage of that population then other populations may follow. HPMOR certainly falls into the criteria of branching into other populations that might benefit from rationality concepts but wouldn't normally be exposed to LW. Are any other groups trying to branch into other populations in similar ways? The online Harry Potter fanfiction community almost certainly can't be the single low-hanging fruit here, guys.
There are a couple of ways of addressing this problem. One is better reputation/karma systems. That would both incentivize people to disseminate true and relevant information, and make it easier to find true and relevant information.
Oh please god no! No no no no no. If you want to go with a handful of experts, Metacritic or Rotten Tomatoes style, then okay, we can try that out. However, I don't trust democracy to be an effective way of finding good ideas and spreading them out to people. Democracy is a way of getting the majority to be in control and slowly shift from their current position to another close-but-similar-and-widely-agreed-upon position. It just produces groupthink over time (which can be good politically and could help avoid political upheaval, but isn't good for what we want it for).
We would still need effective methods of doing double-blind reviews, with future changes in opinion only made public later and the past opinion still visible. That, or some other method that gets people to make reviews while avoiding biases like signaling their in-group, matching positions with people they regard as higher-IQ than them, and all of the other mind-killing biases out there.
I would suggest doing rationality tests, but (assuming people on LW are more rational. now that's a worrying thought) we'd need to adjust for the fact that it would select for people who are exposed to LW rationality or read some of the popular sources like SSC. Besides, people who are rational don't have a monopoly on good ideas and some ideas may be more easily obtainable if you are already heavily biased. (̶Y̶o̶u̶ ̶d̶o̶n̶'̶t̶ ̶g̶e̶t̶ ̶l̶o̶a̶d̶s̶ ̶o̶f̶ ̶i̶n̶t̶e̶r̶e̶s̶t̶i̶n̶g̶ ̶a̶t̶t̶e̶m̶p̶t̶e̶d̶ ̶p̶h̶i̶l̶o̶s̶o̶p̶h̶i̶c̶a̶l̶ ̶p̶r̶o̶o̶f̶s̶ ̶o̶f̶ ̶t̶h̶e̶ ̶e̶x̶i̶s̶t̶e̶n̶c̶e̶ ̶o̶f̶ ̶g̶o̶d̶ ̶i̶n̶ ̶a̶ ̶n̶a̶t̶i̶o̶n̶ ̶f̶u̶l̶l̶ ̶o̶f̶ ̶a̶t̶h̶e̶i̶s̶t̶s̶.̶)̶ (edit you would get "loads". see comments) I think I'm just rambling and disagreeing with myself at this point so I'll stop this line of reasoning.
Another method is automatic quality-control of information (e.g. fact-checking). Google have done some work on this, but still, it is in its infancy. It'll be interesting to follow the development in this area in the years to come.
I like fact-checking. Definitely an applause light going on there. If we've gotten the p-value of studies low enough, the vast majority of experts agree, and there are a notable lack of studies disagreeing then that could be very nice.
Has anyone tried setting up a quality-control engine similar to fact-checking, but which instead works as a meta-analysis generator for scientific journals? It would be nice to see something along the lines of "Vitamin D is good for you (Note: Only 55% of studies agree on this for improving X, Y, and Z. There is no general consensus on the benefits of Vitamin D for anything but bone health. However, Vitamin D has been recognized as safe to take by organization A, so take it anyway and see what happens.)" And then we can also automatically sign you up to an internet database that will email you if that information changes later on, so that you can update your behaviors accordingly.
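A minimal sketch of what the aggregation step of such an engine might look like, assuming studies have already been reduced to (claim, supports-claim) pairs. All the study data below is invented for illustration; real meta-analysis would of course weight by sample size and study quality rather than just counting:

```python
from collections import defaultdict

def summarize(studies):
    """Aggregate (claim, supports) pairs into agreement percentages."""
    tally = defaultdict(lambda: [0, 0])  # claim -> [supporting, total]
    for claim, supports in studies:
        tally[claim][1] += 1
        if supports:
            tally[claim][0] += 1
    return {claim: round(100 * sup / total)
            for claim, (sup, total) in tally.items()}

# Invented example records, for illustration only.
studies = [
    ("vitamin D improves bone health", True),
    ("vitamin D improves bone health", True),
    ("vitamin D improves bone health", True),
    ("vitamin D improves bone health", False),
    ("vitamin D prevents colds", True),
    ("vitamin D prevents colds", False),
]
summary = summarize(studies)
print(summary)
# {'vitamin D improves bone health': 75, 'vitamin D prevents colds': 50}
```

The email-me-when-this-changes part would then just be a diff of successive summaries per claim.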
Besides, people who are rational don't have a monopoly on good ideas and some ideas may be more easily obtainable if you are already heavily biased. (You don't get loads of interesting attempted philosophical proofs of the existence of god in a nation full of atheists.)
Given that one of the people working in CFAR produced a genuine new proof for it making sense to believe in God that made her convert to Catholicism while being employed in CFAR, I don't see where you get that idea.
In my experience this community is very open to thinking all sorts of contrarian ideas.
That's very interesting and I would be interested in seeing her proof. Was it a new idea that religious people had not thought of and spread before?
I should change my claim. People would be likely to think of "loads" of attempted proofs.
The theory I was trying to state is that certain perspectives or states of mind may be more effective at finding certain ideas than others. "Being a contrarian" might not be a good name for a perspective, but I'll treat it as such for the moment. Do you think that if people at CFAR and LW (our local contrarians) were left to their own devices, they would occupy the perspectives needed to reach every single one of the literally hundreds of proofs for the existence of god, compared to people who are highly motivated by belief, social utility, and dedication to an imagined highly dangerous omnipotent deity? Are there some that would be much harder to obtain, or much easier?
I think what I may have been trying to get at was that while LW contrarianism is awesome and a pretty great method for thinking about things, it may not be the best for finding all the possible good ideas out there in idea space.
Now members of the cognitive elite are, or so I claim, reasonably good at distinguishing between good and bad ideas.
What do you mean by "reasonably good"? Hans Rosling makes the point that the cognitive media elite is worse informed than monkeys in the zoo when it comes to the development of Africa: https://www.ted.com/talks/hans_and_ola_rosling_how_not_to_be_ignorant_about_the_world
That's all fine, but you're forgetting about the existence of powerful entities (such as large governments) which can and do affect the information flows on the 'net for their own purposes. They would be very happy with things like a pervasive reputation system or refuting false information because they will control them and use them for their own ends.
It's all for your own good, citizen.
I reckon a good way to identify whether the ideas here are as good as we think they are is whether the established spreaders of good ideas are willing to take them on board.
Has anyone had experience teaching LW-affiliated ideas, non-independent of their LW experience, in universities? Has anyone tried but failed? Any experiences would help others trying to replicate them.
I think it does among the cognitive elite, and that this explains the rise of complex but good ideas such as applied rationality and Effective altruism. I'm less sure about other groups.
The Internet increases the speed and the convenience of communication vastly. It also makes it much easier for people with shared interests to get in contact.
This will of course lead to a tremendous increase in the amount of false or useless information. But it will also lead to an increase in true and relevant information.
Now members of the cognitive elite are, or so I claim, reasonably good at distinguishing between good and bad ideas. They do this not the least by finding reliable sources. They will quickly pass this, mostly true information, on to other members of the cognitive elite. This means that the higher pace of information dissemination will translate into a higher pace of learning true ideas, for this group.
What about non-elite groups? I'm not sure. On the one hand, they are, by definition, not as good at distinguishing between good and bad ideas. On the other hand, they are likely to be heavily influenced by the cognitive elite, especially in the longer run.
By and large, I think we have cause for optimism, though: good ideas will continue to spread quickly. How could we make them spread even quicker? The most obvious solution is to increase the reliability of information. Notice that while information technology has made it much more convenient to share information quickly, it hasn't increased the reliability of information.
There are a couple of ways of addressing this problem. One is better reputation/karma systems. That would both incentivize people to disseminate true and relevant information, and make it easier to find true and relevant information. (An alternative, and to my mind interesting, version is reputation systems where the scores aren't produced by users, but rather by verified experts.)
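One way to sketch the expert-weighted variant: votes carry extra weight when they come from verified experts. The vote format and the weighting factor here are my assumptions for illustration, not part of the proposal above:

```python
def reputation(votes, expert_weight=5.0):
    """Combine user and verified-expert votes into one score in [-1, 1].

    votes: list of (value, is_expert) pairs, value in {+1, -1}.
    Experts count expert_weight times as much -- an arbitrary knob.
    """
    score = sum(v * (expert_weight if expert else 1.0)
                for v, expert in votes)
    weight = sum(expert_weight if expert else 1.0 for _, expert in votes)
    return score / weight if weight else 0.0

# Three ordinary votes (two up, one down) plus one expert upvote:
votes = [(+1, False), (+1, False), (-1, False), (+1, True)]
print(reputation(votes))  # (1 + 1 - 1 + 5) / 8 = 0.75
```

In a pure-expert variant, ordinary users' weights would simply be zero; the incentive question is then how items get surfaced to experts in the first place.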
Another method is automatic quality-control of information (e.g. fact-checking). Google have done some work on this, but still, it is in its infancy. It'll be interesting to follow the development in this area in the years to come.