Comment author: username2 19 October 2015 10:42:24AM 10 points [-]

Luke quotes from Superforecasting on his site:

"Doug knows that when people read for pleasure they naturally gravitate to the like-minded. So he created a database containing hundreds of information sources—from the New York Times to obscure blogs—that are tagged by their ideological orientation, subject matter, and geographical origin, then wrote a program that selects what he should read next using criteria that emphasize diversity. Thanks to Doug’s simple invention, he is sure to constantly encounter different perspectives."

wishing to get his hands on this program.

Does anyone know of something similar, or who this 'Doug' may be? I wonder if this may be as simple as asking the man himself. The book gives 'Doug Lorch' as his full name. Google gives a Facebook account as the first result, but I have no idea if this is an actual match.
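In case nothing public turns up, the system described in the quote is simple enough to sketch. A minimal, hypothetical version (the source names and tags below are made up for illustration, not taken from Doug's actual database):

```python
import random
from collections import Counter

# Hypothetical database: each source is tagged by ideological
# orientation, subject matter, and geographical origin.
SOURCES = [
    {"name": "New York Times", "tags": {"left", "politics", "us"}},
    {"name": "The Economist", "tags": {"center", "economics", "uk"}},
    {"name": "National Review", "tags": {"right", "politics", "us"}},
    {"name": "Obscure Tech Blog", "tags": {"neutral", "technology", "eu"}},
]

def next_source(history, sources=SOURCES):
    """Pick the source least similar to recently read ones.

    `history` is a list of source names already read. We count how
    often each tag has appeared, then prefer the source whose tags
    were seen least, breaking ties at random.
    """
    by_name = {s["name"]: s for s in sources}
    seen = Counter()
    for name in history:
        seen.update(by_name[name]["tags"])

    def overlap(source):
        # How many times this source's tags were already encountered.
        return sum(seen[t] for t in source["tags"])

    lowest = min(overlap(s) for s in sources)
    candidates = [s for s in sources if overlap(s) == lowest]
    return random.choice(candidates)["name"]
```

For example, after reading only the New York Times, `next_source(["New York Times"])` will steer you toward one of the other three sources, since their tags have not yet been seen. The real program presumably uses hundreds of sources and a richer diversity criterion, but the core idea fits in a few lines.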

Comment author: signal 04 January 2016 08:59:22PM 0 points [-]

Did anything come from this? Would love to see that, too!

Comment author: [deleted] 01 December 2015 05:12:11PM 1 point [-]

I believe the usual term for this is "politics". This is one classic reference.

There are actually a few others, such as Group Psychology, Marketing, Economics and Mechanism Design.

In general, I see this as a big problem that requires many different frameworks to have an effect.

In response to comment by [deleted] on Open thread, Nov. 30 - Dec. 06, 2015
Comment author: signal 02 December 2015 09:05:16AM 1 point [-]

Can you point out your 3-5 favorite books/frameworks?

Comment author: Lumifer 30 November 2015 05:58:43PM 2 points [-]

how to influence more than one person

I believe the usual term for this is "politics". This is one classic reference.

Comment author: signal 02 December 2015 09:04:16AM 0 points [-]

Thanks Lumifer. The Prince is worth reading. However, transferring his insights regarding princedoms to how to design and spread memeplexes in the 21st century does have its limits. Any more suggestions?

Comment author: signal 30 November 2015 05:55:06PM 2 points [-]

Can somebody point out textbooks or other sources that lead to an increased understanding of how to influence more than one person (the books I know address only 1:1 interactions or presentations)? There are books on how to run successful businesses, etc., but is there overarching knowledge that also covers successful states, parties, NGOs, religions, and other social groups (this would also be of interest for how best to spread rationality)? In the Yvain framework: taking Moloch as a given, what are good resources that describe how to optimally influence Moloch, with its many self-interested agents and, for example, its inherent game-theoretic problems, as long as AI is not up to the task?

Comment author: Lumifer 19 November 2015 05:10:28PM 4 points [-]

a perfect world is difficult to achieve ... most also would not have expected Wikipedia to work out as well as it does

A perfect world is, of course, impossible to achieve (not to mention that what's perfect to you is probably not so for other people) and as to Wikipedia, there are longer lists than yours of its shortcomings and problems. Is it highly useful? Of course. Will it ever get close to perfect? Of course not.

I was fascinated by LW and thought it possible to make great leaps towards some form of truth. I now consider that unwarranted exuberance.

Sure. But this is an observation about your mind, not about LW.

High relevancy to the reader who is an aspiring rationalist.

"Aspiring rationalist" is a content-free expression. It tells me nothing about what you consider "wrong" or "relevant".

The discussions of AI mostly end where they become interesting.

Heed the typical mind fallacy. Other people are not you. What you find interesting is not necessarily what others find interesting. Your dilemmas or existential issues are not their dilemmas or existential issues.

For example, I don't find the question of "shall we enforce a police state" interesting. The answer is "No", case closed, we're done. Notice that I'm speaking about myself -- you, being a different person, might well be highly interested in extended discussion of the topic.

if you truly think there is an Animal Holocaust (which Singer does), the answer may not be donating $50 to some animal charity.

Yeah, sure, you go join an Animal Liberation Front of some sort, but what's particularly interesting or rational about it? It's a straightforward consequence of the values you hold.

Comment author: signal 19 November 2015 05:39:25PM 0 points [-]

Heed the typical mind fallacy. Other people are not you. What you find interesting is not necessarily what others find interesting. Your dilemmas or existential issues are not their dilemmas or existential issues. For example, I don't find the question of "shall we enforce a police state" interesting. The answer is "No", case closed, we're done. Notice that I'm speaking about myself -- you, being a different person, might well be highly interested in extended discussion of the topic.

I strongly disagree and think it is unrelated to the typical mind fallacy. OK, the word "interesting" was too imprecise. However, the argument deserves a deeper look in my opinion. Let me rephrase it as: "Discussions of AI sometimes end where they have serious implications for real life." Especially if you do not enjoy entertaining the thought of a police state and increased surveillance, you should be worried if respected rational essayists come to conclusions that include them as an option. Closing your case when confronted with possible results from a chain of argumentation won't make them disappear. And a police state, to stay with the example, is either an issue for almost everybody (if it comes into existence) or for nobody. Hence, this is detached from, and not about, my personal values.

Comment author: CAE_Jones 19 November 2015 01:58:20PM *  6 points [-]

By third-world comparisons, yes. Otherwise, I doubt it. Provide an example. (Or pledge 50% of your richness to GiveWell)

Unless the third world includes the United States outside of the Bay Area and New England (which, judging by the term "fly-over country", it probably does in lots of minds), then yes, LWers talking about attending CFAR's $3000 workshops and traveling all over the place and how they're already working for a big software giant and talked their bosses into giving them a raise are signs of being toward the higher end of the American Middle Class, if not higher. Just having so many programmers and the occasional psychiatrist is enough to put LW into the "rich even by first world standards" category.

This has come up before. Some LWer who is not rich points out that LWers are on average pretty dang rich, and most everyone goes "surely not! Just abandon everything you have and move to Silicon Valley with the money you don't have and surely you'll get a programming job, and realize how not-rich we are!" *

I am not trying to signal tribal affiliation when I say that LW unintentionally taught me to appreciate the whole "check your privilege" concept.

Having said all that, there are a few people who aren't financially successful STEM lords around here. It's just that they are decidedly not the majority of dominant voices.

* The first and last phrases might be a bit uncharitable, but the reaction is generally disbelief, in spite of the fact that LWers do seem to have thousands of dollars whenever they need them. Just a couple days ago, someone on Facebook was trying to get someone to go with him on a trip to Indiana, so they could split the gas money, but he realized he really needed to spend that money elsewhere. I've had reasonably middle-class people on Facebook trying to come up with someplace to stay, asking for donations for emergencies, saying how they wish they could justify spending money on things far cheaper than a new computer... and all of them are financially and socially way better off than me.

Comment author: signal 19 November 2015 04:05:43PM *  1 point [-]

I conclude from the discussion that the term "rich" is too vague. Here is my definition: I should be surprised to find many LWers who don't find themselves in the top percentage of the Global Rich List and who could not afford cryonics if they made it their life's goal.

Comment author: Romashka 19 November 2015 12:07:06PM *  0 points [-]

promote unwished-for phenomena such as availability heuristic.

Do you mean that the AH is promoted within the community as a whole (-> consensus achieved without weighing all the evidence) or in individual members (-> mindkills in unrelated areas of real life)? This should be testable.

Comment author: signal 19 November 2015 03:50:49PM 0 points [-]

I meant especially in individual members, as described in the point "priorities." Somewhat along the lines that LW topics are not a representative sample of the topics and conclusions that are relevant to the individual. In other words: the imaginary guide "how to be rational" that I would write for my children differs very much from the guide that LW is providing.

Comment author: entirelyuseless 19 November 2015 01:44:48PM 3 points [-]

I mostly agree with what you are saying here, but you should replace the formatting of the post (which you probably wrote in some other editor) with the standard formatting for LW. The discrepant formatting is distracting and adds the signal "outsider", which will bias people against what you are saying.

Comment author: signal 19 November 2015 02:55:57PM 1 point [-]

Definitely. I am slightly irritated that I missed that. The line spacing and paragraph spacing still seem a bit off compared to other articles. Is there anything I am doing wrong?

Comment author: Gleb_Tsipursky 19 November 2015 01:33:12AM *  2 points [-]

I hear you about the t-shirts and rings, and we are trying to optimize those. Here are two options of t-shirts we think are better: 1 and 2. What do you think?

Comment author: signal 19 November 2015 02:26:28PM 1 point [-]

They are, but I still would not wear them. (And no rings for men unless you are married or have been a champion in basketball or wrestling.)

Let's differentiate two cases regarding whom we may want to address: 1) Aspiring rationalists: that's the easy case. Take an awesome shirt, sneak in "LW" or "pi" somewhere, and try to fly below the radar of anybody who would not like it. A Möbius strip might do the same; a drawing of a cat in a box may work but could also be misunderstood. 2) The not-yet-aspiring rationalist: I assume this is the main target group of InIns. I consider this way more difficult, because you have to keep the weirdness points below the gain, and you have to convey interest in a difficult-to-grasp concept on a small area. And nerds are still less "cool" than sex, drugs, and sports. A SpaceX t-shirt may do the job (rockets are cool), but LW concepts? I haven't seen a convincing solution, but will ask around. Until then, the best solution to me seems to be to dress as your tribe expects and to find other ways of spreading the knowledge.

Comment author: gwern 19 November 2015 01:28:39AM 8 points [-]

I was going to say, I have seen articles by Elon Musk and we have been discussing them recently, it's just that for some reason he's written them under a silly pseudonym like 'Wait But Why'...

Comment author: signal 19 November 2015 01:52:24PM 0 points [-]

Fair enough. Maybe I should take Elon Musk out; with WBW he has found a way to push the value of advertising beyond the cost of his time spent. If Zuckerberg posts too, I will be fully falsified. To compensate, I introduce a typical person X whose personal cost-benefit analysis of posting an article is negative. I still argue that this is the standard case.
