Comment author: Romashka 15 September 2016 06:04:06PM 0 points

Has anyone here had associations/subjective feelings about subjects of study? (Probably in high school, where the range of subjects is wide and students' attitudes depend on the teachers' images, to some degree.) I tended to like algebra, because solving equations and the like reminded me of gradual shifts of attention in yoga-style exercise - flowing, ordered and always seeking the point of balance. Geometry, I took for a much more "masculine" discipline, a form of exercise in endurance, and was not fond of it. Of course, calculus messed things up...:)

Comment author: Sarunas 20 September 2016 02:38:42PM * 0 points

Different subjects do seem to require different thinking styles, but, at least for me, these styles are often quite hard to describe in words. If one has an inclination for one style of thinking, can this inclination manifest in seemingly unrelated areas, thus leading to unexpected correlations? This blog post presents an interesting anecdote.

Comment author: Huluk 26 March 2016 12:55:37AM * 26 points

[Survey Taken Thread]

By ancient tradition, if you take the survey you may comment saying you have done so here, and people will upvote you and you will get karma.

Let's make these comments a reply to this post. That way we continue the tradition, but keep the discussion a bit cleaner.

Comment author: Sarunas 26 March 2016 03:42:26PM 42 points

I have taken the survey.

Comment author: Viliam 17 February 2016 11:58:49AM 2 points

Sigh.

> Logic itself has a very gendered and white supremacist history.

These people are clearly unable to distinguish between "the territory" and "the person who talks about the territory".

I had to breathe calmly for a few moments. Okay, I'm not touching this shit on the object level again.

On a meta level, I wonder how many of the missing rationality skills these people never had vs. how many they had but later lost when they became politically mindkilled.

Comment author: Sarunas 17 February 2016 05:27:02PM * 3 points

I remember reading the SEP article on Feminist Epistemology, where I got the impression that it models the world in a somewhat different way. Of course, this is probably one of those cases where epistemology is tailored to suit political ideas (and they themselves would most likely not disagree), and much less vice versa.

When I (or, I suppose, most LWers) think about how knowledge about the world is obtained, the central example is empirical testing of hypotheses, i.e. a situation where I have more than one map of a territory and I have to choose one of them. An archetypal example of this is a scientist testing hypotheses in a laboratory.

On the other hand, feminist epistemology seems to be largely based on Feminist Standpoint Theory, which basically models the world as full of different people who are adversarial to each other and promote different maps. It seems to carry an assumption that you cannot easily compare the accuracies of maps, either because they are hard to check or because they depict different (or even incommensurable) things. The central question in this framework seems to be "Whose map should I choose?", i.e. the choice is not between maps, but between mapmakers. Well, there are situations where I would do something that fits this description very well: e.g. if I were trying to decide whether to buy a product I could not examine directly, and all the information I had was two reviews, one from the seller and one from an independent reviewer, I would be more likely to trust the latter's judgement.

It seems to me that the first archetypal example is much more generalizable than the second one, and the strange claims cited in Pfft's comment are what one gets when one stretches the second example to extreme lengths.

There also exists Feminist Empiricism, which seems to be based on the idea that since one cannot interpret empirical evidence without a framework, something must be added to an inquiry, and since biases that favour desirable interpretations are something, it is valid to add them (since this is not Bayesian inference, this is different from the problem of the choice of priors). Since the whole process is deemed adversarial (scientists in this model look like prosecutors or defense attorneys), different people inject different biases and then argue that others should stop injecting theirs.

(Disclaimer: I read the SEP article some time ago and wrote about these ideas from memory; it wouldn't be a big surprise if I misrepresented them in some way. In addition, there are other obvious sources of potential misrepresentation.)

Comment author: bogus 15 January 2016 02:24:00PM * 0 points

> A person coming from "social culture" to "nerd culture" may interpret everything as a part of some devious status game.

The social person is right here. Remember 'X is not about Y'? The difference is that your 'social culture' person is in fact low-to-average status in the relevant hierarchy. Something that's just "harmless social banter" to people who are confident in their social position can easily become a 'status attack', or a 'microaggression', from the POV of someone who happens to be more vulnerable. This is not limited to information exchange at all; it's a ubiquitous social phenomenon. And this dynamic makes engaging in such status games a useful signal of confidence, so they're quite likely to persist.

Comment author: Sarunas 15 January 2016 06:02:56PM * 2 points

I think that one very important difference between status games and things that merely remind people of status games is how long they are expected to stay in people's memory.

For example, I play pub quizzes, and often I am the person responsible for the answer sheet. Due to strict time limits, discussion must be as quick as possible, so in many situations I (or another person responsible for the answer sheet) have to reject an idea someone has come up with based on vague heuristic arguments, and usually there is no time for long and elaborate explanations. From the outside, it might look like a status-related thing, because I have dismissed a person's opinion without a good explanation. However, the key difference is that this does not stay in anyone's memory. After a minute or two, all these things that might seem related to status are already forgotten. Ideally, people should not even come into the picture (because paying attention to anything but the question is a waste of time); very often I do not even notice who exactly came up with a correct answer. People tend to forget, or not even pay attention to, whom credit should be given, and they tend to forget cases where their idea was dismissed in favour of another person's. In this situation, small slights that happen because discussion must be as quick as possible are not worth remembering, and one can be pretty certain that other people will not remember them either. Also, if "everyone knows" they are quickly forgotten, they are not very useful in status games either. If something is forgotten, it cannot remain unforgiven.

Quite different dynamics arise if people have long memories for small slights and "everyone knows" that they do. Short memory made slights unimportant and useless for status games; but in the second case, where they are important and "everyone knows" they are important, they become useful for social games, and therefore a greater proportion of them might have some status-related intent behind them rather than being random noise.

Similarly, one might play a board game that involves things that look like social games, e.g. backstabbing. However, it is expected that when the figures go back into the box, all of that is forgotten.

I think that what differentiates information sharing from social games is which of them is likely to be remembered and which is likely to be quickly forgotten (and whether or not "everyone knows" which is most likely to be forgotten or remembered by others). Of course, different people might remember different things about the same situation, and they might be mistaken about what other people remember or forget; that's what a culture clash might look like. On the other hand, the same person might tend to remember different things about different situations, so people cannot be neatly divided into different cultures, but at the same time the frequency of situations of each type seems to differ between people.

Comment author: Gunnar_Zarncke 03 January 2016 05:54:17PM 2 points [-]

Did I miss the 2015 survey or will there be none?

Comment author: Sarunas 03 January 2016 08:29:57PM 3 points [-]
Comment author: pianoforte611 01 January 2016 07:26:36PM 0 points [-]

Not sure if there is a thread for this; does anyone have access to this article?

“Comparative Efficiency of Informal (Subjective, Impressionistic) and Formal (Mechanical, Algorithmic) Prediction Procedures: The Clinical–Statistical Controversy”, Psychology, Public Policy, and Law 2: 293–323

Comment author: Sarunas 02 January 2016 10:36:07PM 1 point [-]
In response to Engineering Religion
Comment author: Sarunas 07 December 2015 10:38:43PM * 1 point

I am really not a sociologist, so someone correct me if what I'll say is totally wrong, but it seems to me that there are at least two quite distinct types of religion (and a continuum of possibilities in between). In the first type, "religion" religion (gods, clergy, etc.) is almost one and the same thing as something like the civil religion of a community (for example, if you found out that a tribe adds various religious chants to its local "judicial process", which is otherwise very similar to a Western judicial process, you would not hesitate to call it a religious ritual, even though the chanting part may be inessential). In the second type, those are two different things. In my mind the first type roughly corresponds to paganism, and the second to religions similar to Christianity. I think that religions of the first type may be useful to "grease the wheels" of a society (especially a not very sophisticated one), and their leaders may not even be that interested in spreading them, except for personal gain. However, note that this is a vague guess; I would need to read much more about pagan societies to understand whether it is at least partially the case. It is unclear to me what religions of the second type do, because greasing the wheels of society is covered by a civic religion.

Another possible benefit of some types of religion is providing incentives to get some (although not all) things correct. If you believe that god judges whether your thinking about the world is correct, you might feel and behave as if you have "skin in the game" and thus end up with higher motivation to avoid deceiving yourself (and others) for personal gain [1]. In many contemporary societies, personal belief in god seems isolated from most beliefs that have practical consequences. If your society practices trial by combat, and your belief in god makes you willing to fight a stronger person believing that you will win just because you are in the right, your belief will have practical personal consequences. However, in a contemporary society belief in god is usually harmful only indirectly. Thus, the question is: does feeling that you have "skin in the game of believing the truth about god's creation" lead to enough correct beliefs to outweigh having incorrect beliefs about god? I think it is likely that in at least some cases it might. In addition, perhaps in some cases religious beliefs might be personally helpful if they are harmless and displace potentially dangerous beliefs that are new and have not yet shed their most extreme parts.

I think that in both cases religion/belief in god is not strictly a necessity, but in some cases it may (or may not) turn out to be somewhat useful if there are no better alternatives available at the time.

[1] I don't like it. Also, I guess that it is probably not very good for society in the long run.

In response to comment by gjm on LessWrong 2.0
Comment author: Vaniver 04 December 2015 01:36:05AM 6 points [-]

I agree with you that the motivational bits, of wanting to acculturate to LW to be around the cool people, rely on the cool people being here.

The main reason I'm uncertain about the forum as the right model is that I don't see it in many other educational contexts and I think there are weird dynamics around the asymmetry between questioners and answerers and levels of competence/experience. (The cool people want some, but not too much, interaction with not-yet-cool people.) Perhaps the Slack and IRC channels and similar venues deserve some more of my attention as potential solutions here.

> Vaniver's other suggestion for something that would serve this need better than a Redditalike is Stack Overflow. That's a better fit, but the SO model works best where what people need is answers to specific questions that have clear-cut answers.

Agreed. This dynamic gets even worse when the problems are psychological. If someone goes to Stack Overflow and posts "hey, this code doesn't do what I expect. What's going wrong?" we can copy the code and run it on our machines and find the issue. If someone goes to Sanity Overflow and posts "hey, I'm akratic. What's going wrong?" we... have a much harder time.


One of the things that comes up every now and then is the idea of rewriting the Sequences, and I think the main goal there would be to make them with as little of Eliezer's personality shining through as possible. (I like his personality well enough, but it's clear that many don't, and a more communal central repository would reduce some of the idiosyncrasy concerns.)

Some think that the Sequences could be significantly shortened, but I suspect that's optimism speaking instead of experience. There are only a handful of sections in the Sequences where Eliezer actually repeats himself, and even then it's likely one of those places where, really, it's worth giving them three slightly different versions of the same thing to make sure they get it.

In response to comment by Vaniver on LessWrong 2.0
Comment author: Sarunas 07 December 2015 09:27:49PM * 0 points

I have read somewhere that, all else being equal, dialogues attract people's attention better than monologues, at least on television. Perhaps in some cases some ideas (including old sequence posts, especially the more controversial ones) could be presented as Socratic dialogues; or, if a post is being written collaboratively, one person could write a position and one or two others could ask inquisitive questions or try to find holes in his or her argument. You would think that having comments already covers that, and in a sense it is indeed similar to having two waves of comments. However, in this case the post seen by most people has already covered at least a few objections and is thus of relatively higher quality. Secondly, this allows "debate" posts that do not present any clear conclusion and contain only arguments for different positions (where the controversy lies is often an interesting and informative question). Thirdly, I conjecture that it is psychologically more pleasant to be nitpicked by one or two people (whom you know were explicitly asked to do so) than by a lot of commenters at once. You could call this series "Dialogues Concerning (Human) Rationality" or something like that.

Of course, not all posts should be written as dialogues (e.g. some more technical ones might be difficult to structure this way).

In response to comment by VAuroch on LessWrong 2.0
Comment author: Viliam 06 December 2015 11:25:58PM * 6 points

> Back when LW was more active, there was much lower math density in posts here.

Maybe because many people are not sure whether their topics are "LW-worthy", but when they do something mathematical they feel comfortable posting it here. If I write my opinion about something, people will most likely disagree; but if I write an equation and solve it correctly, there is nothing to disagree with.

In response to comment by Viliam on LessWrong 2.0
Comment author: Sarunas 07 December 2015 09:01:31PM 1 point

Assuming this trend exists (I haven't noticed it), I think that in addition we also have the fact that reaching higher-hanging fruit requires better tools.

In response to comment by Romashka on LessWrong 2.0
Comment author: Lumifer 07 December 2015 03:30:04PM 3 points

If you want to replace all usernames with "Anonymous" and all karma scores with a zero, this can probably be done by a bit of client-side JavaScript. You'll need something like a Greasemonkey add-on and the appropriate JS script.
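
A minimal sketch of such a userscript might look like the following. The `.author` and `.score` selectors and the `@match` URL are assumptions for illustration; you would need to inspect the actual page to find the real class names. The replacement logic is pulled into a plain function so it does not depend on the browser environment.

```javascript
// ==UserScript==
// @name     Anonymize usernames and karma (sketch)
// @match    http://lesswrong.com/*
// ==/UserScript==

// Replace the text of every node in `nodes` with `replacement`.
// Works on any objects exposing a writable `textContent` property,
// including the DOM nodes returned by document.querySelectorAll.
function anonymize(nodes, replacement) {
  for (const node of nodes) {
    node.textContent = replacement;
  }
  return nodes.length; // number of nodes rewritten
}

// Only touch the DOM when one is actually present (i.e. in a browser).
// NOTE: ".author" and ".score" are hypothetical selectors.
if (typeof document !== "undefined") {
  anonymize(document.querySelectorAll(".author"), "Anonymous");
  anonymize(document.querySelectorAll(".score"), "0");
}
```

Greasemonkey would run this on every page load matching the `@match` pattern, so dynamically loaded comments would need an extra hook (e.g. a MutationObserver) that this sketch omits.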

In response to comment by Lumifer on LessWrong 2.0
Comment author: Sarunas 07 December 2015 03:47:19PM 8 points

Checking "Enable Anti-Kibitzer" in Preferences already does that.
