I'm jealous of all these LW meetups happening in places that I don't live. Is there not a sizable contingent of LW-ers in the DC area?
I think it's pretty well-established here that having accurate beliefs shouldn't actually hurt you.
Not at all. It is well established that having accurate beliefs should not hurt a perfect Bayesian intelligence. Believing that it applies to mere humans would be naive in the extreme.
It's not a good strategy to change your actual beliefs so that you can signal more effectively -- and it probably wouldn't work, anyway.
The fact that we are so damn good at it is evidence to the contrary!
I'm not understanding the disagreement here. I'll grant that imperfect knowledge can be harmful, but is anybody really going to argue that it isn't useful to try to have the most accurate map of the territory?
I think it's pretty well-established here that having accurate beliefs shouldn't actually hurt you.
Hmm: Information Hazards: A Typology of Potential Harms from Knowledge ...?
I haven't read that paper (thanks for the link -- I'll definitely do so), but it seems that's a separate issue from choosing which beliefs to hold based on what they will do for your social status. Still, I would argue that limiting knowledge is preferable only in select cases -- not a good general rule to abide by, partial knowledge of biases and such notwithstanding.
...though it is also worth noting that humans have evolved to be reasonably good lie-detectors.
If your actual beliefs don't match your signalled beliefs, others may pick up on that, expose you as a liar, and punish you.
And ideally, you'd take that fact into account in forming your actual beliefs. I think it's pretty well-established here that having accurate beliefs shouldn't actually hurt you. It's not a good strategy to change your actual beliefs so that you can signal more effectively -- and it probably wouldn't work, anyway.
How constructive is:
- Beliefs are for controlling anticipation (Not for being interesting)
...? ...since beliefs do, in fact, serve all kinds of signalling purposes among humans.
It's probably useful at this point to differentiate between actual beliefs and signaled beliefs, particularly because if your beliefs control anticipation (and accurately!), you would know which beliefs you want to signal for social purposes.
And many of the people in this community rub me the wrong way.
Yes, like you, for stealing my post idea! Kidding, obviously.
At the risk of contributing to this community becoming a bit too self-congratulatory, here are some of the more significant concepts that I've grokked from reading LW:
No Universally Compelling Arguments and Ghosts in the Machine. Shamefully, it never even occurred to me to de-anthropomorphize the idea of a mind.
You Provably Can't Trust Yourself and No License To Be Human, along that same theme.
The Luminosity sequence is a bit under-celebrated, I think, relative to its value. I've found it to be one of the most important things I've read here, and applying those concepts has helped me improve my life in not-insignificant ways.
Affective Death Spirals! I cannot praise this enough for giving me the skills to recognize the phenomenon and keep myself from spiraling at the negative end.
Most of all, LW has taught me that being the person that I want to be takes work. To actually effect any amount of change in the world requires understanding the way it really is, whether you're doing science or trying to understand your own personality flaws. Refusing to recognize said flaws doesn't make them go away, reality doesn't care about your ego, etc.
And apparently there was this Bayes guy who had a pretty useful theorem...
But then again, I actually have been pointed to LW from three different sources, so perhaps it was inevitable.
Which three sources? (I'm guessing your brother was one, but I'm curious about the other two.)
The other two were a friend of mine and a productivity blog whose name and URL I have since forgotten.
Don't forget: Wikipedia happened.
And this is precisely why I haven't lost all hope for the future. (That, and we've got some really bright people working furiously on reducing x-risk.) On rare occasions, humanity impresses me. I could write sonnets about Wikipedia. And I hate when so-called educators try to imply that Wikipedia is low status or somehow making us dumber. It's the kind of conclusion that the Gatekeepers of Knowledge wish were accurate. How can you possibly get access to that kind of information without paying your dues? It's just immoral.
I pose this question: if you had to pick just one essay to introduce someone to LW, which one would you pick and why? I'd like to spread access to the information in the sequences so that it can benefit others as it did me, but I'm at a loss as to where specifically to start. Just tossing someone a link to the list of sequences is... overwhelming, to say the least. And I've been perusing them for so long that I can't remember what it's like to read with fresh eyes; the essays that have the most impact on me now were, I think, incomprehensible to me a year ago.
So, I was directed toward this post, in no small part because I am, demographically, a bit unusual for LW. At times, I'm quite optimistic about LW and rationality-in-general's prospects, but then I remember that my being here, and participating, is the product of happenstance. But then again, I actually have been pointed to LW from three different sources, so perhaps it was inevitable.
Ah, but here comes my embarrassing admission:
Most people who are already awesome enough to have passed through all these filters are winning so hard at life (by American standards of success) that they are wayyy too busy to do boring, anti-social & low-prestige tasks like reading online forums in their spare time (which they don’t have much of).
The above is a much more influential factor in my considering how much to participate than I feel happy admitting. I'll openly admit that being rational is not my default mode; I wasn't even targeted as "bright" as a kid. No out-of-the-ordinary test scores came from me. I have had to really work to keep my thoughts from being immediately processed through an "Is this the kind of belief that will get me social status?" filter. So, I do have this massive fear that being rational is just not natural for me. Nor is my IQ, I suspect, anywhere near the high end of the spectrum here... though that filter for social status has been, I think, obscuring my intelligence for most of my life.
Does socializing on the internet feel low-status to me? Yeah, it does... and had I not basically grown up on the internet, I doubt I'd ever give a community like this a second glance. It's been really tough divorcing society's ideal of what is status-y from what I actually want to do. I love the internet, and I spend a vast amount of time on it, but it still feels low status to me, so it's not something I advertise -- despite my ability to find more interesting conversation here than I can possibly hope to find in real life!
So, even though I was pointed to LW multiple times independently, I probably would never have actually become an active participant (insofar as I am one) had I not had the personal endorsement of my brother, who is an active member, that this was a very intelligent place. Honestly, I wasn't properly calibrated to identify this place as, well, what it actually is. I don't know what to suggest to get this to be more appealing to people that are like me - that is, smart enough to benefit from the sequences, but not likely to seek it out on their own. The rationality book is probably the best bet.
Can people do Saturday the 18th? 2pm? Bar or coffee?
I could make an appearance. I'm not super familiar with DC, so staying pretty close to a Metro station would be ideal.