Not directly related to MoR, but whatever. I recently joined a massive HP roleplay forum, and what I noticed among the players was a great deal of optimisation by proxy. The general sentiment is that being sorted into one house means you have no traits from the others. This makes some sense, because a wizard employer will probably look at a candidate's house affiliation first. I'll need to reread some of the books to check whether it's canon, but in the fans' minds at least, all of Magical Britain is aligning itself to an arbitrary division. It's a bit disturbing, really.
I think movements grow at their margins. It is there that we can make the greatest impact. So perhaps we should focus on recent converts to rationality: figure out how they converted, what factors were involved, and how the transition could have been made easier, taking their personalities into account.
This is what I have been trying to do with the people I introduce rationality to who are somewhat receptive. It is not only a victory when they begin to accept rationality; it is also an opportunity to observe how best to foster more such conversions.
It is somewhat worrying that these people tend to be geeks/nerds, but given the nature of rationality itself and of personality, this shouldn't be too surprising. The success of geek/nerd culture in recent mainstream pop culture is a cause for hope, I think, even if some geeks/nerds will argue their culture has been distorted in the process. What matters is that the public is no longer averse to geeks/nerds. Such aversion is often the first and strongest impediment to even beginning to consider topics such as rationality.
It ultimately comes down to social acceptance. We should leverage the phenomenon by focusing on the fringes of the rationality demographic: they have the most power to exert peer pressure and convert further rationalists.
If we use LW as a metric of conversion, then you can consider me a new convert, lured here by the occasional link from the Octagon. This is, of course, a pretty weak metric. I've been interested in rational thinking since the 9th grade, when I went to a debate club and realised that people went there to win arguments, not to get to the truth. While I've done my best to keep my actions and words rational in cases that seem detached from my personal life, I think I mostly fail at self-examination.
My personal observations confirm that the geek/nerd social group is the most receptive to rationality, but there is a significant buffer layer around the group that can be influenced and converted.
P.S. It feels good to finally register here. And... am I the only one who feels a bit odd using the word "convert" in this context?
Now I'm wondering how you could subvert that. I'm imagining something like a Legend of Zelda game which is split into two phases:
A fairly long preliminary phase, where you don't know about saving the world or anything. This should have maybe one or two boss fights and teach you how to play the game well.
A race against time to save the world. Bad stuff starts happening at preset times (plus or minus some randomness), and you've got to hurry and go for the high-probability ideas in order to maximize your chance of not losing. Skip the side-quests and mini-dungeons unless they hold some important item, because Kakariko Village will be destroyed in 4-5 hours of game-time. To enhance the sense of urgency, make save-scumming impossible, and make it harder to die to compensate for the increased difficulty. Make sure there are several ways to win in any scenario, so the player doesn't have to rely on trial-and-error to find the one officially blessed way of doing something. And to hell with switch mazes.
I would definitely play this game. It would be intense. And the quest for 100% completion would result in absolutely crazy Let's Play videos.
I'd suggest looking at Pathologic, which implements a world-saving task with a set time limit. You are free to walk around, talk to people, and try to do your regular side-questing, but if you don't learn some things and do some things before the first day is over, you lose. The game world is quite alive in itself: important characters go about their daily business, so you have to ask people around for directions to find them.
It creates a lifelike situation, where you can't really predict the causal links between your actions and possible progress towards your goal.
I noticed that the described fallacy only applies to cases where you can evaluate the possible returns with some reliability. Say you're trying to learn about druidic herbology. You could spend time t1 finding some books on it and then time t2 reading them, for skillset s. Or you could spend T1 > t1 finding an expert in the field, ask for lessons or the best books, and then spend T2 studying towards skillset S. The problem is that you can predict t1 and T1, but until either path is actually taken, you can't evaluate the extra time needed or the value of the resulting skillsets.
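To make the point concrete, here's a toy Monte Carlo sketch (with entirely made-up numbers) of the herbology comparison. The search times t1 and T1 are fixed and knowable up front, while the study times t2 and T2 are drawn from distributions you can only guess at in advance — which is exactly why the two paths can't be compared reliably before committing to one:

```python
import random

random.seed(0)

def total_time_direct():
    # t1: time to find books yourself -- known in advance
    t1 = 5.0
    # t2: study time is uncertain until you actually start reading
    t2 = random.uniform(10, 40)
    return t1 + t2

def total_time_expert():
    # T1 > t1: finding an expert takes longer, but is still predictable
    T1 = 12.0
    # T2: guided study, hypothetically a narrower spread of outcomes
    T2 = random.uniform(8, 20)
    return T1 + T2

# Estimate the expected totals by simulation. Only t1 and T1 are
# observable before choosing; t2, T2 (and the worth of s vs S) are not.
N = 100_000
avg_direct = sum(total_time_direct() for _ in range(N)) / N
avg_expert = sum(total_time_expert() for _ in range(N)) / N
```

With these invented distributions the expert route comes out ahead on average, but shifting the assumed ranges flips the answer — the comparison hinges entirely on quantities you can't observe beforehand.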