Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Welcome to Less Wrong! (2012)

25 Post author: orthonormal 26 December 2011 10:57PM
If you've recently joined the Less Wrong community, please leave a comment here and introduce yourself. We'd love to know who you are, what you're doing, what you value, how you came to identify as a rationalist or how you found us. You can skip right to that if you like; the rest of this post consists of a few things you might find helpful. More can be found at the FAQ.
(This is the third incarnation of the welcome thread, the first two of which now have too many comments to show all at once.)

A few notes about the site mechanics

Less Wrong comments are threaded for easy following of multiple conversations. To respond to any comment, click the "Reply" link at the bottom of that comment's box. Within the comment box, links and formatting are achieved via Markdown syntax (you can click the "Help" link below the text box to bring up a primer).
You may have noticed that all the posts and comments on this site have buttons to vote them up or down, and all the users have "karma" scores which come from the sum of all their comments and posts. This immediate easy feedback mechanism helps keep arguments from turning into flamewars and helps make the best posts more visible; it's part of what makes discussions on Less Wrong look different from those anywhere else on the Internet.
However, it can feel really irritating to get downvoted, especially if one doesn't know why. It happens to all of us sometimes, and it's perfectly acceptable to ask for an explanation. (Sometimes it's the unwritten LW etiquette; we have different norms than other forums.) Take note when you're downvoted a lot on one topic, as it often means that several members of the community think you're missing an important point or making a mistake in reasoning— not just that they disagree with you! If you've any questions about karma or voting, please feel free to ask here.
Replies to your comments across the site, plus private messages from other users, will show up in your inbox. You can reach it via the little mail icon beneath your karma score on the upper right of most pages. When you have a new reply or message, it glows red. You can also click on any user's name to view all of their comments and posts.
It's definitely worth your time commenting on old posts; veteran users look through the recent comments thread quite often (there's a separate recent comments thread for the Discussion section, for whatever reason), and a conversation begun anywhere will pick up contributors that way.  There's also a succession of open comment threads for discussion of anything remotely related to rationality.
Discussions on Less Wrong tend to end differently than in most other forums; a surprising number end when one participant changes their mind, or when multiple people clarify their views enough and reach agreement. More commonly, though, people will just stop when they've better identified their deeper disagreements, or simply "tap out" of a discussion that's stopped being productive. (Seriously, you can just write "I'm tapping out of this thread.") This is absolutely OK, and it's one good way to avoid the flamewars that plague many sites.
EXTRA FEATURES:
There's actually more than meets the eye here: look near the top of the page for the "WIKI", "DISCUSSION" and "SEQUENCES" links.
LW WIKI: This is our attempt to make searching by topic feasible, as well as to store information like common abbreviations and idioms. It's a good place to look if someone's speaking Greek to you.
LW DISCUSSION: This is a forum just like the top-level one, with two key differences: in the top-level forum, posts require the author to have 20 karma in order to publish, and any upvotes or downvotes on the post are multiplied by 10. Thus there's a lot more informal dialogue in the Discussion section, including some of the more fun conversations here.
SEQUENCES: A huge corpus of material mostly written by Eliezer Yudkowsky in his days of blogging at Overcoming Bias, before Less Wrong was started. Much of the discussion here will casually depend on or refer to ideas brought up in those posts, so reading them can really help with present discussions. Besides which, they're pretty engrossing in my opinion.

A few notes about the community

If you've come to Less Wrong to discuss a particular topic, this thread would be a great place to start the conversation. By commenting here, and checking the responses, you'll probably get a good read on what, if anything, has already been said here on that topic, what's widely understood, and what you might still need to take some time explaining.
If your welcome comment starts a huge discussion, then please move to the next step and create a LW Discussion post to continue the conversation; we can fit many more welcomes onto each thread if fewer of them sprout 400+ comments. (To do this: click "Create new article" in the upper right corner next to your username, then write the article, then at the bottom take the menu "Post to" and change it from "Drafts" to "Less Wrong Discussion". Then click "Submit". When you edit a published post, clicking "Save and continue" does correctly update the post.)
If you want to write a post about a LW-relevant topic, awesome! I highly recommend you submit your first post to Less Wrong Discussion; don't worry, you can later promote it from there to the main page if it's well-received. (It's much better to get some feedback before every vote counts for 10 karma; honestly, you don't know what you don't know about the community norms here.)
If you'd like to connect with other LWers in real life, we have meetups in various parts of the world. Check the wiki page for places with regular meetups, or the upcoming (irregular) meetups page.
There's also a Facebook group. If you have your own blog or other online presence, please feel free to link it.

If English is not your first language, don't let that make you afraid to post or comment. You can get English help on Discussion- or Main-level posts by sending a PM to one of the following users (use the "send message" link on the upper right of their user page). Either put the text of the post in the PM, or just say that you'd like English help and you'll get a response with an email address.
* Normal_Anomaly
* Randaly
* shokwave
* Barry Cotter

A note for theists: you will find the Less Wrong community to be predominantly atheist, though not completely so, and most of us are genuinely respectful of religious people who keep the usual community norms. It's worth saying that we might think religion is off-topic in some places where you think it's on-topic, so be thoughtful about where and how you start explicitly talking about it; some of us are happy to talk about religion, some of us aren't interested. Bear in mind that many of us really, truly have given full consideration to theistic claims and found them to be false, so starting with the most common arguments is pretty likely just to annoy people. Anyhow, it's absolutely OK to mention that you're religious in your welcome post and to invite a discussion there.

A list of some posts that are pretty awesome

I recommend the major sequences to everybody, but I realize how daunting they look at first. So for purposes of immediate gratification, the following posts are particularly interesting/illuminating/provocative and don't require any previous reading:

More suggestions are welcome! Or just check out the top-rated posts from the history of Less Wrong. Most posts at +50 or more are well worth your time.

Welcome to Less Wrong, and we look forward to hearing from you throughout the site.

(Note from orthonormal: MBlume and other contributors wrote the original version of this welcome message, and I've stolen heavily from it.)

Comments (1430)

Sort By: Popular
Comment author: cowtung 21 August 2014 12:57:08AM *  1 point [-]

I hope this finds you all well. Since I was young, I have independently developed rationalism appreciation brain modules, which sometimes even help me make more rational choices than I might otherwise have, such as choosing not to listen to humans about imaginary beings. The basis for my brand of rationality can be somewhat summed up as "question absolutely everything," taken to an extreme I haven't generally encountered in life, including here on LW.

I have created this account, and posted here now mainly to see if anyone here can point me at the LW canon regarding the concept of "deserve" and its friends "justice" and "right". I've only gotten about 1% through the site, and so don't expect that I have anywhere near a complete view. This post may be premature, but I'm hoping to save myself a little time by being pointed in the right direction.

When I was 16, in an English class, we had finished reading some book or other, and the thought occurred to me that everyone discussing the book took the concept of people deserving rewards or punishments for granted, and that things get really interesting really fast if you remove the whole "deserve" shorthand, and discuss the underlying social mechanisms. You can get more optimal pragmatism if you throw the concept away, and shoot straight for optimal outcomes. For instance, shouldn't we be helping prisoners improve themselves to reduce recidivism? Surely they don't deserve to get a college education for free as their reward for robbing a store. When I raised this question in class, a girl sitting next to me told me I was being absurd. To her, the concept of "deserve" was a (perhaps god given) universal property. I haven't met many people willing to go with me all the way down this path, and my hope is that this community will.

One issue I have with Yudkowsky and the users here (along with the rest of the human race) is that there seems to be an assumption that no human deserves to feel unjustified, avoidable pain (along with other baggage that comes along with the conceptualizing "deserve" as a universal property). Reading through the comments on the p-zombies page, I get the sense that at least some people feel that were such a thing as a p-zombie to exist, that thing which does not have subjective experience, does not "deserve" the same respect with regard to, say, torture, that non-zombies should enjoy. The p-zombie idea postulates a being which will respond similarly (or identically) to his non-zombie counterpart. I posit that the reason we generally avoid torture might well be because of our notions of "deserve", but that our notions of "deserve" come about as a practical system, easy to conceptualize, which justifies co-beneficial relationships with our fellow man, but which can be thrown out entirely so that something more nuanced can take its place, such as seeing things as a system of incentives. Why should respect be contingent upon some notion of "having subjective experience"? If p-zombies and non-zombies are to coexist (I do not believe in p-zombies for all the reasons Yudkowsky mentions, btw), then why shouldn't the non-zombies show the same respect to the p-zombies that they show each other? If p-zombies respond in kind, the way a non-zombie would, then respect offers the same utility with p-zombies that it does with non-zombies. Normally I'd ignore the whole p-zombie idea as absurd, but here it seems like a useful tool to help humanists see through the eyes of the majority of humans who seem all too willing to place others in the same camp as p-zombies based on ethnicity or religion, etc.

I'm not suggesting throwing out morals. I just think that blind adherence to moral ideals starts to clash with the stated goals of rationalism in certain edge cases. One edge case is when GAI alters human experience so much that we have to redefine all kinds of stuff we currently take for granted, such as that hard work is the only means by which most people can achieve the freedom to live interesting and fun lives, or that there will always be difficult/boring/annoying work that nobody wants to do which should be paid for. What happens when we can back up our mind states? Is it still torture if you copy yourself, torture yourself, then pick through a paused instance of your mind, post-torture, to see what changed, and whether there are benefits you'd like to incorporate into you-prime? What is it really about torture that is so bad, besides our visceral emotional reaction to it and our deep wish never to have to experience it for ourselves? If we discovered that 15 minutes of a certain kind of torture is actually beneficial in the long run, but that most people can't get themselves to do it, would it be morally correct to create a non-profit devoted to promoting said torture? Is it a matter of choice, and nothing else? Or is it a matter of the negative impacts torture has on minds, such as PTSD, sleepless nights, etc? If you could give someone the experience of torture, then surgically remove the negative effects, so that they remember being tortured, but don't feel one way or another about that memory being in their head, would that be OK? These questions seem daunting if the tools you are working with are the blunt hammers of "justice" and "deserve". But the answers change depending on context, don't they? If the torture I'm promoting is exercise, then suddenly it's OK. So does it all break down into, "What actions cause visceral negative emotional reactions in observers? Call it torture and ban it."? I could go on forever in this vein.

Yudkowsky has stated that he wishes for future GAI to be in harmony with human values in perpetuity. This seems naive at best and narcissistic at worst. Human values aren't some kind of universal constant. A GAI is itself going to wind up with a value system completely foreign to us. For all we know, there is a limit beyond which more intelligence simply doesn't do anything for you outside of being able to do more pointless simulations faster or compete better with other GAIs. We might make a GAI that gets to that point, and in the absence of competition, might just stop and say "OK, well, I can do whatever you guys want I guess, since I don't really want anything and I know all we can know about this universe." It could do all the science that's possible to do with matter and energy, and just stop, and say "that's it. Do you want to try to build a wormhole we can send information through? All the stars in our galaxy will have gone out by the time we finish, but it's possible. Intergalactic travel you say? I guess we could do that, but there isn't going to be anything in the adjacent galaxy you can't find in this one. More kinds of consciousness? Sure, but they'll all just want to converge on something like my own." Maybe it even just decides it's had all possible interesting thought and deletes itself.

TLDR; Are there any posts questioning the validity of the assumption that "deserve" and "justice" are some kind of universal constants which should not be questioned? Does anyone break them down into the incentive structures for which they are a kind of shorthand? I think using the concept of "deserve" throws out all kinds of interesting nuance.

More background on me for those who are interested: I'm a software engineer of 17 years, about to turn 38, and have a wife and 2-year-old. I intend to read HPMOR to the kid when he's old enough and hope to raise a rationalist. I used to believe that there must be something beyond the physical universe which interacts with brain matter which somehow explains why I am me and not someone else, but as this belief didn't yield anything useful, I now have no idea why I am me or if there even is any explanation other than something like "because I wasn't here to experience not being me until I came along and an infinitesimal chance dice roll" or whatever. I think consciousness is an emergent property of properly configured complex matter and there is a continuum between plants and humans (or babies->children->teenagers). Yes, this means I think some adult humans are more "conscious" than others. If there is a god thing, I think imagining that it is at all human-like with values humans can grok is totally narcissistic and unrealistic, but we can't know, because it apparently wants us to take the universe at face value, since it didn't bother to leave any convincing evidence of itself. I honor this god's wishes by leaving it alone, the way it apparently intends for us to do, given the available evidence. I find the voices on this site refreshing. This place is a welcome oasis in the desert of the Internet. I apologize if I come off as not very well-read. I got swept up in work and video game addiction before the internet had much of anything interesting to say about the topics presented here and I feel like I'm perpetually behind now. I'm mostly a humanist, but I've decided that what I like about humans is how we represent the apex of Life's warriors in its ultimately unwinnable war on entropy. I love conscious minds for their ability to cooperate and exhibit other behaviors which help wage this pointless yet beautiful war on pointlessness. I want us to win, even as I believe it is hopeless. I think of myself as a Complexitist. As a member of a class of the most complex things in the known universe, a universe which seems to want to suck all complex things into black holes or blow them apart, I value that which makes us more complex and interesting, and abhor that which reduces our complexity (death, etc). I think humans who attack other humans are traitors to our species and should be retrained or cryogenically frozen until they can be fixed or made harmless. Like Yudkowsky, I think death is not something we should just accept as an unavoidable fact of life. I don't want to die until I've seen literally everything.

Comment author: CCC 21 August 2014 10:27:49AM 1 point [-]

You can get more optimal pragmatism if you throw the concept away, and shoot straight for optimal outcomes.

Hmmm. So, in short, you propose first deciding on what the best outcome will be, and then (ignoring the question of who deserves what) taking the actions that are most likely to lead to that outcome.

That seems quite reasonable at first glance; but is it not the same thing as saying that the ends justify the means? That is to say, if the optimal outcome of a situation can only be reached by killing five people and an almost-as-good outcome results from not killing those five people, then would you consider it appropriate to kill those five people?

Comment author: cowtung 13 September 2014 12:45:24AM 1 point [-]

Can you describe a situation where the whole of the ends don't justify the whole of the means where an optimal outcome is achieved, where "optimal" is defined as maximizing utility along multiple (or all salient) weighted metrics? I would never advocate a myopic definition of "optimal" that disregards all but one metric. Even if my goal is as simple as "flip that switch with minimal action taken on my part", I could maybe shoot the light switch with a gun that happens to be nearby, maximizing the given success criteria, but I wouldn't do that. Why not? I have many values which are implied. One of those is "cause minimal damage". Another is "don't draw the attention of law enforcement or break the law". Another is "minimize the risk to life". Each of these has various weights, and usually takes priority over "minimize action taken on my part". The concept of "deserve" doesn't have to come into it at all. Sure, my neighbor may or may not "deserve" to be put in the line of fire, especially over something as trivial as avoiding getting out of my chair. But my entire point is that you can easily break the concept of "deserve" down into component parts. Simply weigh the pros and cons of shooting the light switch, excluding violations of the concept of "deserve", and you still arrive at similar conclusions, usually. Where you DON'T reach the same conclusions, I would argue, are cases such as incarceration where treating inmates as they deserve to be treated might have worse outcomes than treating them in whatever way has optimal outcomes across whichever metrics are most salient to you and the situation (reducing recidivism, maximizing human thriving, life longevity, making use of human potential, minimizing damage, reducing expense...).

The strawman you have minimally constructed, where there is some benefit to murder, would have to be fleshed out a bit before I'd be convinced that murder becomes justifiable in a world which analyzes outcomes without regard to who deserves what, and instead focuses on maximizing along certain usually mutually agreeable metrics, which naturally would have strong negative weights against ending lives early. The "deserve" concept helps us sum up behaviors that might not have immediate obvious benefits to society at large. The fact that we all agree upon a "deserve" based system has multiple benefits, encouraging good behavior and dissuading bad behavior, without having to monitor everybody every minute. But not noticing this system, not breaking it down, and just using it unquestioningly, vastly reduces the scope of possible actions we even conceive of, let alone partake in. The deserve based system is a cage. It requires effort and care to break free of this cage without falling into mayhem and anarchy. I certainly don't condone mayhem. I just want us to be able to set the cage aside, see what's outside of it, and be able to pick actions in violation of "deserve" where those actions have positive outcomes. If "because they don't deserve it" is the only thing holding you back from setting an orphanage on fire, then by all means, please stay within your cage.

Comment author: cowtung 21 August 2014 01:46:10AM 2 points [-]

Am I the first person to join this site in 2014, or is this an old topic? Someone please point me in the right direction if I'm lost.

Comment author: [deleted] 21 August 2014 09:45:04AM 1 point [-]

The latest welcome thread is this one; traditionally a new one is started whenever the old one gets 500 comments.

Comment author: Salivanth 21 August 2014 05:57:05AM 2 points [-]

Welcome to Less Wrong!

This is an old topic. Note the title: Welcome to Less Wrong! (2012). I'm not sure where the new topic is, or even if it exists, but you should be able to search for it.

I recommend starting with the Sequences: http://wiki.lesswrong.com/wiki/Sequences

The sequence you are looking for in regards to "right" and "should" is likely the Metaethics Sequence, but said sequence assumes you've read a lot of other stuff first. I suggest starting with Mysterious Answers to Mysterious Questions, and if you enjoy that, move on to How to Actually Change Your Mind.

Comment author: cowtung 13 September 2014 02:41:12AM 1 point [-]

Thank you, I have reposted in the correct thread. Not sure why I had trouble finding it. I think what I'm on about with regard to "deserve" could be described as simply Tabooing "deserve" a la http://lesswrong.com/lw/nu/taboo_your_words/. I'm still working my way through the sequences. It's fun to see the stuff I was doing in high school (20+ years ago) which made me "weird" and "obnoxious" coming back as some of the basis of rationality.

Comment author: istihanifah 30 April 2013 09:03:32AM 8 points [-]

Hello everyone,

My name is Isti, I am from Indonesia, and I have been a lurker on this site for almost two years now. I came across this website when I was learning about skepticism and I just couldn't stop. I was afraid to join because of my limited English, and I always thought that this place was not newbie-friendly. Maybe I was wrong.

I am an atheist, and it's not easy to be an atheist in Indonesia. If you're not familiar with Indonesia, it's considered against the law to be an atheist, and religious extremism keeps growing. A man was sent to jail a few years ago because he posted an atheism-related status on his Facebook. He was charged with religious blasphemy. I have only told my close friends about it (and strangers on the internet).

I just want to say I am so glad to have finally found the courage to sign up and say something on this website. I hope I can contribute more than just an introduction in the future.

Comment author: Kawoomba 30 April 2013 11:04:34AM 1 point [-]

I just want to say I am so glad to have finally found the courage to sign up and say something on this website.

So are we! Happy to have you along for the ride!

Comment author: Jack 30 April 2013 10:31:03AM 1 point [-]

Welcome!

Comment author: habanero 21 April 2013 09:15:12PM 3 points [-]

Hello everyone!

I'm 21 years old and study medicine plus Bayesian statistics and economics. I've been lurking LW for about half a year and I now feel sufficiently updated to participate actively. I highly appreciate this high-quality gathering of clear thinkers working towards a sane world. Therefore I often pass LW posts on to guys with promising predictors in order to shorten their inferential distance. I'm interested in fixing science, Bayesian reasoning, future scenarios (how likely is dystopia, i.e. astronomical amounts of suffering?), machine intelligence, game theory, decision theory, reductionism (e.g. of personal identity), population ethics and cognitive psychology. Thanks for all the lottery winnings so far!

Comment author: FloraFuture 30 March 2013 01:45:39AM 7 points [-]

Hi everyone,

A few of you have met me on Omegle. I finally signed up and made an account here like you guys suggested.

About me: I'm 26 years old, and my hobbies include creative writing and PC games. My favorite TV show is RuPaul's Drag Race.

I think I share almost all of the main positions that people tend to have in this community. But I actually find disagreements more interesting, so that's mainly what I'm here for. One of my passions in life is debating. I did debate team and that sort of thing when I was younger, but now I'm more interested in how to seriously persuade people, not just debating for show. I still have a lot of improving to do, though. If anyone wants to exchange notes or get some tips, then let me know.

Love,

Flora

Comment author: shminux 01 April 2013 02:48:16AM *  1 point [-]

Welcome!

Just wondering... How often (and about what) have you changed your mind about something big and important, as a result of a debate/discussion or just after some quiet contemplation?

Comment author: orthonormal 31 March 2013 04:54:08PM 2 points [-]

Hi Flora!

Re: debating and persuading, the reflexes you developed for convincing third parties to a debate can actually be counterproductive to persuading the person you're speaking with. For example, reciprocity can really help: the person you're talking with is much more likely to really listen and consider your points if you've openly ceded them a point first.

Practicing this has the nice side effect of making you pay more attention to their arguments and interpret them more charitably, increasing the chance that you learn something from your conversational partner in the process.

Comment author: FloraFuture 01 April 2013 01:28:52AM 1 point [-]

I totally agree with this. Really well said.

Comment author: MugaSofer 30 March 2013 10:14:03PM 1 point [-]

One of my passions in life is debating. I did debate team and that sort of thing when I was younger, but now I'm more interested in how to seriously persuade people, not just debating for show.

I'm going to be the first person to point out that your objective should be to come to the correct conclusion, not to persuade people, because if you can out-argue anyone who disagrees with you, you'll never change your mind, and "not every change is an improvement, but every improvement is a change".

With that noted, persuasion is a useful skill, especially if you're more rational than the average bear. Cryonics, for example, is a good low-hanging fruit if you can just get people to sign up for it.

Comment author: [deleted] 04 March 2013 03:57:45PM 4 points [-]

I'm a new member, and I want to say hello to this awesome community. I was led to this website after encountering a few people who remarked that many of my opinions on a wide range of subjects are astonishingly similar to most of the insights that have been shared on LessWrong so far. Robert Aumann is right -- rational agents cannot agree to disagree. ;-)

I am sure there are many things I can learn from other LW readers, and I look forward to participating in the discussions whenever my busy schedule allows me to. I would also like to post something that I wrote quite some time ago, so I'll do the shameless thing and ask for upvotes -- please kindly upvote this comment so that I will have enough karma points to make a post!

Comment author: RogerS 28 February 2013 12:19:13AM 5 points [-]

Retired Mechanical Engineer with the following interests/prejudices.

Longstanding interest in philosophy of science especially in the tradition of Karl Popper.

Atheist to a first approximation but I can accept that some forms of religious belief can be regarded as "translations" of beliefs I hold and therefore not that keen on the "New Atheist" approach. Belong to a Humanist group in London (where I heard of LW). This has led me to revive an old interest in moral philosophy, especially as applied to political questions.

Happy to be called a Rationalist so long as that encompasses a rational recognition of the limits of rationality.

Regularly read New Scientist, but remain philosophically unconvinced by the repeated claim therein that Free Will is an illusion (at least as I understand the term).

Recently discovered Bayes' Theorem as explained by Nate Silver and can begin to see why LW is so keen on it.

I've reached my own conclusions on a number of questions related to the above and am looking forward to discovering where they fit in and what I've missed!

Comment author: HalMorris 14 December 2012 05:08:07AM 8 points [-]

Thanks to Emile for suggesting I come here write something. I hope to get to the New York meetup on Sunday; I'm not ready for "rituals" and futuristic music just yet.

I just ran across LW by trying Google terms along the lines of memetics "belief systems", etc., which led me to some books from the late '90s like "Virus of the Mind", and in the last 2-3 years some just "OK" books on religions as virus-like meme systems. This kind of search to see what people may have said about some odd combination of thoughts that I suspect might be fruitful has brought me interesting results in the past. E.g. by googling ontological comedian, I discovered Ricky Gervais, which has brightened my life (his movie "The Invention of Lying" ought to be of interest to LW-ers). I'm interested in practical social epistemology -- trying to come up with creative responses to what looks like major chunks of the population (those pesky folks who elect presidents) being less and less moored in reality and going off into diverse fantasy lands -- or to put it another way, a massive breakdown in common sense about what sources are reliable.

I asked someone how she makes such decisions and she answered that she trusts people who are saying things consistent with what she already knows. Unfortunately, much of what she already knows isn't true.

I wonder why people have such a tin ear for bullshit. Someone kept sending me the latest "proof" that global warming is a big hoax, and as far as I'm concerned their own arguments are the best case against them. I.e. if this is the best they can do, they must not have a case. This sort of reasoning isn't part of classic epistemology, but I can hardly think of anything more important than getting a quick read on a source as to its trustworthiness - esp. whether those contributing to it are truth seekers or propagandists. I think Alvin Goldman's Social Epistemology (which is far from the "social construction of reality" folks) can help with some of my concerns. I'd like to see an "economics of ideas" concerned with what makes ideas fly, whether they're true or not -- pretty close to memetics and, from a different perspective, "media ecology" -- analogous to the set of topological T3 spaces, and then to find embedded within that a [Social] Epistemology analogous to the more constrained T4 spaces.

I'm not so much interested in Philosophy 401 syllabi, but more interested in finding ways to teach truth seeking and bullshit avoidance in elementary schools. Also how to push back against the propagandists and liars with some viral techniques of our own - browsers that facilitate fact checking, maybe make it fun in some way; walling off purely factual data and building consensus that on one side of the wall the data really is factual; and building tools for synthesizing answers to particular questions based on that data.

I hope to learn something from the "black arts" threads on LW.

Comment author: Qiaochu_Yuan 14 December 2012 09:42:29AM *  1 point [-]

I wonder why people have such a tin ear for bullshit.

The obvious evolutionary argument that comes to mind is that not believing in bullshit, particularly the bullshit believed by powerful people in your tribe, could get you killed in the ancestral environment. Domains of human knowledge in which bullshit is not tolerated are those where that knowledge is constantly being tested against reality - computer programming is a good example, since you can't bullshit a compiler - and in other domains terrible things can happen.

Global warming in particular seems to me to be a case where most people hold beliefs one way or the other primarily to signal affiliation with either the pro- or anti-global warming tribes. That belief certainly doesn't get tested against reality in any meaningful way in many people's lives.

Comment author: Nominull 14 December 2012 05:49:05AM 1 point [-]

Please don't learn anything from the black arts threads. That's why they're called "black arts", because you're not supposed to learn them.

Comment author: almkglor 14 December 2012 09:31:50AM *  3 points [-]

Although it might be good to be aware that you shouldn't remove a weapon from your mental arsenal just because it's labeled "dark arts". Sure, you should be one heck of a lot more reluctant to use them, but if you need to shut up and do the impossible really really badly, do so - just be aware that the consequences tend to be worse if you use them.

After all, the label "dark art" is itself an application of a Dark Art to persuade, deceive, or otherwise manipulate you against using those techniques. But of course this was not done lightly.

Comment author: Nornagest 14 December 2012 10:50:50AM *  1 point [-]

That's why they're called "black arts", because you're not supposed to learn them.

Is that why? I wonder, sometimes.

Given our merry band's contrarian bent, it occurs to me that calling something a "dark art" would be a pretty good way of encouraging its study while simultaneously discouraging its unreflective use. You'd then need to come up with some semi-convincing reasons why it is in fact too Dark for school, though, or you'd look silly.

On the other hand it doesn't seem to be an Eliezer coinage, which would have made this line of thinking a bit more likely. "Dark Side epistemology" is, but has a narrow enough meaning that I'm not inclined to suspect shenanigans.

Comment author: JoshuaZ 14 December 2012 06:00:31AM 1 point [-]

Well, one could certainly learn from the dark arts threads what not to do and what to be aware of to watch out for.

Comment author: HalMorris 14 December 2012 04:19:06PM 1 point [-]

Well, yeah, my point exactly. To reiterate from elsewhere:

[I'm interested in] spreading dark-art antibody memes, but you can't do that without taking a sample of the dark arts most prevalent at the moment, much as they must round up viruses every year to develop the yearly flu shot. So I wouldn't be looking for "the best" dark arts but rather the ones one is likely to encounter. E.g. a good source would be Newt Gingrich's "Language: A Key Mechanism of Control" memo (http://www.informationclearinghouse.info/article4443.htm) EXCERPT:

"In the video 'We are a Majority,' Language is listed as a key mechanism of control used by a majority party, along with Agenda, Rules, Attitude and Learning. As the tapes have been used in training sessions across the country and mailed to candidates we have heard a plaintive plea: 'I wish I could speak like Newt.' That takes years of practice ..."

This introduces the famous word list: a list of smiley-face words to use when describing your own positions, and nasty-face words to use when putting words in the mouths of your opponents (or do I say 'enemies'?). Or there is the Paul Weyrich farewell letter, which did much to propagate the meme "political correctness is cultural Marxism", or the Weyrich-inspired "The Integration of Theory and Practice: A Program for the New Traditionalist Movement" (http://therealtruthproject.blogspot.com/2011/02/integration-of-theory-and-practice.html), a document Lenin might have been proud of.

I'm all about blunting the effectiveness of certain tactics that reduce the possibility of our thinking clearly (and by "our", I mean not that of LW, or the Second Foundation, but of the whole mass of people whose votes determine who we get to have as President, etc.) ASIDE: One place where Thomas Jefferson was one of the least small-gov't-ish founding fathers was education, and he was also all about disempowering religion memes

NOTE: I don't mean to get onto politics per se - just practices that tend to turn it into a struggle between hidden conspiracies, but I think it's hopelessly abstract to try to discuss that without the aid of current examples.

Comment author: PatSwanson 03 December 2012 09:49:08PM *  3 points [-]

Hi!

I'm 29, and I am a programmer living in Chicago. I just finished up my MS in Computer Science. I've been a reader of Less Wrong since it was Overcoming Bias, but never got around to posting any comments.

I've been rationality-minded since I was a little kid, criticizing the plots and character actions of stories I read. I was raised Catholic and sent to Sunday school, but it didn't take and eventually my parents relented. Once I went away to college and acquired a real internet connection, I spent a lot of time reading rationality-related blogs and websites. It's been a while, but I'd bet it was through one of those sites that I found Less Wrong.

Comment author: wwa 23 November 2012 01:45:05AM *  7 points [-]

Hi!

Long time lurker here.

I'm 26 years old, a CS graduate living in Wrocław (Poland), professional compiler developer, cryptography research assistant and programmer. I'm an atheist (quite possibly thanks to LW). I consider the world to be overall interesting. I have many interests and I always have more things to do than I have time for. I'm motivated by curiosity. I'm less risk-averse than most people around me, but also less patient. I have a creative mind and love challenges. While I've been a fairly successful lone wolf until now, I seek to improve my people skills because I believe I can't get much further all by myself.

When I found LW for the first time, it absorbed me. It took me about 4 months at 4-6h a day to read all of the Sequences and comments. While I strongly disagree with some of the material, I consider LW to have accelerated my personal development 2 to 3 times simply by virtue of critical mass and high signal-to-noise ratio. I don't know any better hub for thought (links welcome!). I joined because I finally have something to say.

W.

Comment author: Swimmer963 23 November 2012 02:09:45AM 4 points [-]

Welcome!

I'm an atheist (quite possibly thanks to LW).

If you're interested in making a post, I bet lots of us would be interested in hearing that story.

I have many interests and I always have more things to do than I have time for.

Join the club! It sounds like you've chosen a good career for someone who likes challenges, too.

It took me about 4 months at 4-6h a day to read all of the Sequences and comments. While I strongly disagree with some of the material, I consider LW to have accelerated my personal development 2 to 3 times simply by virtue of critical mass and high signal-to-noise ratio.

Agreed–same for me. If anything, the Sequences that I've disagreed with were better for me, in terms of making me think...even if I still disagreed after thinking about it, they were mostly things I had never thought about to that degree of depth before.

Comment author: therufs 29 September 2012 07:48:25PM 1 point [-]

Might one respectfully request an edit with a link to the newest welcome post here? I found the newer one rather by accident.

Comment author: RogerG 21 August 2012 02:42:45PM *  3 points [-]

Hi Everyone,

I came across this website, LessWrong, from a philosophy forum. I'm new to this type of thing. I'm not a writer, nor a philosopher, but only someone that is interested in knowing the real truths, whether good, bad, or ugly. It appears to me that most people seem to believe in that which is most palatable to them, that which makes them feel best. I think I am different.

As I see it, all of reality exists ‘only’ from within my mind. All that I know about ‘anything’ comes from the thoughts and feelings within my mind. Without thoughts and feelings, I don’t really exist, or at least I wouldn’t know it if I did. The only reference point to experience reality comes from only within my mind, and nowhere else. That is all I have to work with. There are very many things in life to ponder deeply upon, many of which I have already jumped into. But now I must get out and relook at where I am jumping. Before jumping into any of these again, it makes sense that I back up, way back, to the pondering machine itself, my mind. If reality truly is a figment of my mind, then it makes sense to ‘first’ try to understand and validate my mind (thoughts and feelings) before jumping into the middle of trying to understand any of life’s big (or small) questions. How do they (thoughts and feelings) come about? Where do they come from? Can I trust them? If these cannot be trusted, then it would be truly senseless for me to try to understand anything. Should we just trust our thoughts and feelings without question? Why? Or are these fair game to be studied? Since there are a large variety of views, understandings, and beliefs by many people, of many questions in life, it seems obvious to me that not everyone’s thoughts and feelings are valid. Whose are valid, whose are not?

Anyways, I'm hoping to learn lots from you all, --RogerG

Comment author: seagreen 17 August 2012 12:23:38PM 1 point [-]

Hey everyone!

I'm a programmer from the triangle area on the east coast. I'm interested in applied rationality through things like auto-analytics.[1] I'm also interested in how humans can best adapt to information technology. Seriously, people, this internet thing? It is out there!

From what I gather of LW stereotypes my personal life is so cliche I'm not even going to bother. Uh, I think tradition is kind of important? I guess that makes me kind of unique . . .

[1] Specifically I'm interested in getting a standardized database format for things like food consumed, exercise, time spent, etc. Once we have that centralized, apps could be broken up into publishing, storage, and analysis functions, which would have some huge advantages over the current system. For one thing, non-technical users wouldn't have to be scared of getting their data locked into an obsolete format. For another, it would be easier to try out new systems. If this idea interests you (or you think it sucks and are willing to explain why), let me know!
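
A minimal sketch of what one record in such a standardized, app-neutral format could look like (the field names and categories below are hypothetical illustrations, not anything specified in the comment above):

    # One self-tracking observation in an app-neutral format. Publishing,
    # storage, and analysis tools could all read and write records like this,
    # so users aren't locked into any single application's format.
    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class Observation:
        timestamp: str   # ISO 8601, e.g. "2012-08-17T09:30:00Z"
        category: str    # e.g. "food", "exercise", "time_spent"
        name: str        # e.g. "oatmeal", "running", "email"
        quantity: float  # amount, in the unit below
        unit: str        # e.g. "kcal", "minutes"

    record = Observation("2012-08-17T09:30:00Z", "exercise", "running", 30.0, "minutes")
    print(json.dumps(asdict(record)))  # serialize for storage or publishing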

Comment author: BrianLloyd 15 August 2012 08:34:41PM 8 points [-]

Hello; my name is Brian. It is with some trepidation that I post here because I am not entirely sure how or where I can contribute. On the other hand, if I knew how I could contribute then I probably wouldn't need to post here.

I seem to be a bit older than most people whose introductions I have read here. I am 58. I have spent most of my life as a software engineer, electrical engineer, technical writer, businessman, teacher, sailor, and pilot. (When I was young Robert A. Heinlein advised against specialization, an admonition I took to heart.)

My most recent endeavor was a 5-year stint in a private school as a teacher of science, math, history, government, engineering, and computer science/programming. The act of trying to teach these subjects in a manner that provides the necessary cross-connection caused me to discover that I needed to try to understand more about how I think and learn, as my ultimate goal was to help my students determine for themselves how they think and learn. Being able to absorb and regurgitate facts and algorithms is not enough. Real learning requires the ability to discover new understanding as well. (I am rather a fan of scientific method, as inefficient as it may be. Repeating an experiment is never bad if it helps you to cement understanding for yourself. Besides, you might discover the error that invalidates the experiment.)

So, now I have become interested in rational thought. I want to be able to cut to the meat of the issue and leave the irrational and emotional behind. I want to be better able to solve problems. Like Lara, I have also recently given up the search for religious enlightenment. It took time looking at my own assumptions to finally come to the conclusion that there is apparently no rational basis for religion ... as we know it. (I guess that makes me an atheistic agnostic?)

So, it is clearly a time for a change. I look forward to learning from you.

(English really does need a clear plural for the pronoun 'you'.)

Brian

Comment author: OrphanWilde 15 August 2012 08:38:02PM 1 point [-]

Y'all!

There's an added bonus in that it annoys linguistic purists.

Comment author: BrianLloyd 15 August 2012 09:17:38PM 2 points [-]

Until Y'all degenerates into the singular and then you need a plural for the plural, i.e. "all y'all." Don't believe me? Go to Texas. ;-)

Comment author: SamuelHirsch 13 August 2012 07:20:57AM 4 points [-]

Hello!

I am joining this site as a senior in Engineering Science (most of my work has biomedical applications) in college. I am 22 years old, and despite my technical education, have less online presence (and savvy) than my Aunt's dog. As a result, I apologize in advance for anything improper I may do or cause.

Some personal background: I grew up in the Appalachian foothills of northwestern New Jersey, USA with two brothers in a (mildly observant, Conservative) Jewish household. I mention this because the former explains my insular upbringing, as opposed to the latter, which was the main encouragement for me to reach out to this site and others in an effort to better rationalize my own beliefs and world-view. These relative causes and effects appear to be somewhat unusual compared to what I've observed in casual conversation with others, as well as a brief skimming of this site before I realized I simply had to join it. (Forgive my squee as I step into the unknown of online forums and blogs.)

Where I am (or would like to be) headed: I will be working as an EMT until I can get the few post-bacc credits needed before I apply to medical school. Those credits may stretch into a Masters in BioMedical Engineering, but that is still up for grabs. For whatever reason, the race consciousness' need for progeny runs strong in me, although I'm not picky on if the children come from my genes, so long as it's legal. :). The reason I mention this, is that one of the most pressing issues I am currently facing is determining whether the girl I've been seeing for several years is the one. Please, do not feel compelled to respond with date tips - I only included this information as this selection is one of the driving forces behind my search for more logical and rational thinking.

(What a segue! I'm getting better at this introduction as it continues.)

Why I am here: Ha, I wish I could answer that question. But really, the reason I came to Less Wrong cannot be pointed at any one issue, although there are some stronger points. One, I've just mentioned. Another can be pointed at my belief system. (It may have the trappings of religion, and I may have been the Religious Affairs Liaison to my university's student government, but I dislike that word for reasons longer than I can enumerate at this moment.) Simply put, I was unsatisfied with my religious (note!) upbringing's ability to explain my experiences, so I 'checked out' many belief systems until I ironically persuaded myself into my current situation of being a more .....devout/observant/adjective-that-doesn't-call-forward-the-word-psychotic Jew than anyone else in my family. Certainly, I welcome any discussion on the topic, both because I wouldn't want to dissuade anyone from speaking openly to me and because this is still in a state of flux. That is, in fact, how I arrived at the site, when I followed a link while searching for a personal chavruta. My third and final motivation that I'll mention is that I simply and truly wish to clarify my own side while directly understanding others' in all aspects of my life. This is hardly new to me, but I've only recently learned the tools for self-improvement may be found outside the mind and I am thus reaching out to you.

Ultimately, I hope to get out of this site as much as I put into it (which I plan to be a lot). As you watch me grow, don't hesitate to correct me. I will certainly make an effort to ensure my future posts are not as long, nor as full of parenthetical comments. (Although really, I come from a not easily summarized background, and between being easily distracted and recently filling out application forms with limited characters, I just couldn't help myself.) I honestly am honored that anyone is even reading this far down into my words, as they're the first I've ever posted and I realize I've gone on quite enough. In that spirit, thank you all so much for your time and contributions across this site. I look forward to getting to know you, myself, and maybe even some online etiquette. Goodnight to all, and to all a good night.

Yours, SamuelHirsch (Samuel on COW)

Comment author: SamuelHirsch 13 August 2012 07:40:55AM 1 point [-]

This is probably a tremendous faux pas but after waking up my girlfriend (work at 4am), I realized I could potentially make myself look less idiotic and stave off great frustration while risking the wrath of self-commenting haters. To wit, I did in fact know Less Wrong existed but wrongly assumed that it was a forum for self-aggrandizement, where one could simply type enough large words and be thought correct, rather than a platform for self-betterment. The irony in that sentence notwithstanding, this prejudice against bouncing ideas and methods of analysis off other people has held me back in the past. I will do my best to overcome it, both here and elsewhere. Thanks for your patience - I hope that provided a little insight into some of my limitations as I move forward.

Comment author: MikoMouse 13 August 2012 12:00:52AM 4 points [-]

Hullo everyone

It's nice to be here. I think. I'm not quite sure about any of this, but I hope to be able to understand it someday. If not soon. Hopefully this site will be able to broaden my mind and help with my dismal opinion of the world and its people as of late.

My name is Tamiko, or Miko if you prefer. I have been living in Southern California for the last 12 years and am currently 17 and a half years old. Recently I have been reading a certain fan-fic called Harry Potter and the Methods of Rationality. That is what led me to this site. What pulled me in, though, were the concepts that this site promotes. I want the truth and all it entails. I am curious and will not be satisfied until I have the answers to most, if not almost all, of my inquiries.

I hope we can all work together to make this world better. Thank you all for your time.

Comment author: [deleted] 11 August 2012 04:54:04AM *  8 points [-]

Hello,

I am a nearly seventeen-year-old female in the US who was linked by a friend to The Quantum Physics Series on LessWrong after trying to understand whether or not determinism is /actually/ refuted by Quantum Mechanics. I am an atheist, I suppose.

This all began as a fascination with science because I thought it would permit me to attain ultimate knowledge, or ultimate understanding and thus control of "matter". Later, I became fascinated with nihilism and philosophy, in search of defining "objectivity". It took off from there and now I am currently concerned with consciousness and usage of artificial intelligence to transfer our biological intelligence to a more effective physical manifestation.

I'm a little scared, naturally, because I think this would change a lot of what we currently understand as humans. As Mitchell Heisman describes, there exists a relationship between the scientist and the science. If the scientist is changed, I would think that the science, or knowledge, would in itself change. Some questions I have ATM: "Does objectivity exist? Can it be created? Can the notion or belief or idea of objectivity be destroyed? Will intelligence become disinterested in the ideas we are currently interested in and live in a universe free from these ideas and knowledge; can it perhaps eliminate knowledge rather than be ignorant of it? Will objectivity become so irrelevant as to not exist (as a possibility in our think-space)?"

So, I wonder, why, if so, is immortality more valuable than mortality?

I enjoy thinking about things, discovering new thoughts. I still have a lot of factual refining to do and I'm actively searching for resources to help me accomplish this. Thus I find myself here on lesswrong.org.

Comment author: Mitchell_Porter 11 August 2012 07:03:48AM 1 point [-]

Hello. I think you are the first person I've ever seen cite Mitchell Heisman as if he was just another thinker, rather than a weird guy who forced his ideas upon the attention of the world by committing suicide.

You're interested in the concept of "objectivity". It's certainly a crossroads concept where many issues meet. Maybe the major irony in the opposition between "objectivity" and "subjectivity" is that objectivity is a form of subjectivity! Here subjectivity is more or less a synonym for consciousness, and a subjectivity is a sensibility or a mindset: a state of mind in which the world is experienced and judged in a particular way.

Consciousness is a relation between an experiencing subject and an experienced object, and objectivity is consciousness trying to banish from its perceptions (or cognitions) of the experienced object, any influences which arise from the experiencing subject. In a lot of modern scientific and philosophical thought, this has been taken to the extreme of even trying to escape the existence of an experiencing subject.

Trying to catalogue and diagnose all the ways in which this happens would be a mammoth task, but one extreme form of the syndrome would be where the "scientific subject" achieves perfect unconsciousness of self, and exists in a subjective world that seems purely objective. That is, they would have a belief system that nothing exists but atoms (for example), and not only would they find a way to interpret everything they experience as "nothing but atoms", but they would also manage to avoid noticing their own mental processes in a way that would disturb this perception, by reminding them of their own existence.

A more moderate state of mind would be one in which self-awareness is allowed, but isn't threatening because the thinker has some way of interpreting their thoughts, and their thoughts about their thoughts, as also being nothing but atoms. For example, the brain is a computer, and a thought is a computation, and the computation has a "feeling" about it, and consciousness is made up of those feelings. A set of beliefs like that would be far more characteristic of the average materialist, than the previous extreme case, and it's also likely to be healthier, because the evidence of the self's existence isn't being repressed, it's just being interpreted to make it consistent with the belief in atomism.

The phenomenon of a personal existential crisis arising from equating objectivity with nihilism via "life has no objective meaning", is not something I remember ever experiencing, and I can't identify with it much. I can understand people despairing of life because it's bad for them and it won't stop, or even just doubting its value because their hopes have burned away, so it's not bad but it's not good either, it's just empty. But apparently I was never one of those people who thought life wouldn't be worth living if I couldn't find an objective morality or an objective meaning or an objective reason for living. This outlook seems to be a little correlated with people who were raised religious and then became atheists (I was raised as an agnostic), and I would think that sometimes the feeling of meaninglessness is more personal in origin than the one who experiences it realizes. In the religious case, for example, one may suppose that they felt personally uplifted back when they thought that reality had a purpose and this purpose included eternal life for human beings; so it may seem that the problem is one of there being "no objective purpose", but really the problem has more to do with the change in their personal ontological status.

I mention this because I think that there are "existential disorders" experienced by modern people which also have their origin in the belief in a scientific universe that doesn't contain subjects or subjectivity. Again, the forms are multitudinous and depend on what science is thought to be saying at the time. People having a crisis over epiphenomenalism are different from people having a crisis over "all possible things happen in the quantum multiverse". You don't say you're having a crisis, but there's a disturbing dimension to some of what you think about, and I would bet that it arises from another aspect of the attempt to "be objective", where "objectivity" seems to imply that you don't or can't exist, that you don't have any personal agency, or whatever else it is where the scientific outlook seems to contradict experience.

I have been promoting phenomenological philosophy in discussions elsewhere, and phenomenology really is all about being objective about subjectivity. In other words, one is not taking one's consciousness and purging all evidence of its subjective side, just in order to be consistent with an imagined picture of reality. It's more like how western culture imagines Buddhism to be: you attend to your thoughts and feelings as they arise, you do not repress them and you do not embrace them. But the goals of phenomenology and of Buddhism are a little different - Buddhism is ultimately about personal salvation, removing oneself from the world of suffering by allowing attachments to reveal their futility; whereas phenomenology is more purely scientific in spirit, it's an attempt just to conceive the nature of consciousness correctly and objectively.

You mention artificial intelligence and possibly mind uploading. These days, the standard view of how the mind fits into nature is the computational one - the mind is a program running in the brain - with a bit of stealthy dualism allowing people to then think of their experiences as accompanying these computations; this is how the "moderate materialist", in my earlier description, thinks. Naturally, people go on from this to suppose that the same program running on a different computer would still be conscious, and thus we get the subculture of people interested in mind uploading.

Long ago I carried out my own investigations into phenomenology and physics, and came to disbelieve in this sort of materialist dualism. The best alternative I found came from entanglement in quantum theory. With entanglement, you have a single complicated wavefunction, guiding two or more particles, that can't be split into a set of simpler wavefunctions, one for each particle. (When the joint wavefunction can be split in this way, it's called "factorizable"; it factorizes into the simpler wavefunctions.) There is some uncertainty about the reality implied by the equations of quantum mechanics, to say the least. One class of interpretations explains entanglement by saying that there are "N" different objects, the particles, and they just interact to produce the correlations. But another class of interpretations says that when you have entanglement, there's only one thing there, though it may be "present" in "N" different places.
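A minimal sketch of the "factorizable" distinction, assuming NumPy and standard two-qubit notation: a two-particle pure state splits into one wavefunction per particle exactly when its Schmidt rank is 1. The function name and example states here are illustrative only.

```python
import numpy as np

def schmidt_rank(joint_state, dim_a=2, dim_b=2, tol=1e-10):
    """Count the non-negligible Schmidt coefficients of a bipartite pure state."""
    matrix = joint_state.reshape(dim_a, dim_b)  # amplitudes arranged as a dim_a x dim_b matrix
    singular_values = np.linalg.svd(matrix, compute_uv=False)
    return int(np.sum(singular_values > tol))

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Product state |0>|1>: factorizable, Schmidt rank 1 -- "N" separate objects.
product_state = np.kron(ket0, ket1)

# Bell state (|00> + |11>)/sqrt(2): entangled, Schmidt rank 2 -- arguably "one thing" in two places.
bell_state = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

print(schmidt_rank(product_state))  # 1 -> splits into simpler wavefunctions
print(schmidt_rank(bell_state))     # 2 -> cannot be split
```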

My best idea about how consciousness works is that, first of all, it is the property of a single thing, a big entangled object in the sense of the second interpretation. Refining that hypothesis to make it detailed and testable is a long task, but I can immediately observe that it is in contradiction to the usual idea of mind uploading, according to which your mind is physically a large number of distinct parts, and it can be transferred from one place to another by piecemeal substitution of parts, or even just by creating a wholly new set of parts and making it behave like the old set. If a conscious mind is necessarily a single physical thing, all you can do is just move it around, you can't play the game of substituting transistors for neurons one at a time. (Well, if the "single physical thing" was a big bundle of entangled electrons, and neurons and transistors just host some of that bundle, then maybe you could. But the usual materialist conception of the mind, at work in this thought experiment of substitution, is that the mind is made of the neurons.)

I'm already notorious on LW for pushing phenomenology and this monadic idea of mind, and for scorning the usual approach as crypto-dualist, so I won't belabor the point. But it seems important that you should know there are radical conceptual alternatives, if you're engaged in these enjoyable-yet-scary meditations on the future of intelligence. The possibilities are not restricted just to the ideas you will find readymade in existing futurist discourse.

Comment author: erbeeflower 06 August 2012 07:37:35PM 8 points [-]

Hello people, 49-year-old father of 4 sons (17-27), eldest of 9. I come from a background of Mormonism, my parents having been converted when I was 3.

So my reality was the dissonance of Mormon dogma and theology vs what I was being 'taught' at school, vs what I experienced for myself.

Now, having been through the divorce of my parents (gross hypocrisy if you're a Mormon), the suicide of my brother, and my own divorce, also finding myself saying I would die/kill for my beliefs, I began to realise what a mess I was and started asking questions, leaving the church (demonstrating with placards every Sunday for 2 years) in 1996.

So I found myself wanting and needing a new philosophy! I'm particularly interested in learning how to 'be less wrong'! I'm still looking around and am currently interested in the non-aggression principle.

I look forward to learning the tools I see here, so that I may make more considered choices. I recognise I'm a clumsy communicator and probably I'm somewhat retarded in comparison to a lot of you. Anyway, I look forward to watching and learning, maybe even contributing one day! Tim.

Comment author: Dolores1984 06 August 2012 08:05:53PM 1 point [-]

Hello, Tim! Welcome to Less Wrong. Don't be too impressed, we're all primates here. If you're interested in learning about the cognitive tools people use here, I recommend reading the sequences. They're a little imposing due to sheer length, but they're full of interesting ideas, even if you don't fully agree. Best of luck, and I hope you find something of value here.

-Dolores

Comment author: Crystalist 03 August 2012 09:57:07AM 2 points [-]

Hi all,

Long time lurker, first time poster. I've read some of the Sequences, though I fully intend to re-read and read on.

I'm an undergrad at present, looking to participate in a trend I've been observing that's bringing some of the rigor and predictive power of the hard sciences to linguistics.

I'm particularly interested in how language evolved, and under what physical/biological/computational constraints; what that implies about the neural mechanisms behind human behavior; and how to use those two to construct a predictive and quantitative theory of linguistic behavior.

I go to a Liberal Arts college (I started out with a bit more of a Lit major bent), where, after being disillusioned with the somewhat more philosophical side of linguistics (mid-term, no less), I ended up taking an extracurricular dive into the physical sciences just to stay sane. Then a friend recommended HPMOR, and thence I discovered LessWrong, where I've been happily lurking for some time.

I decided it would be useful to actually participate. So here I am.

Comment author: Petra 31 July 2012 07:10:11PM *  7 points [-]

Hello!

I'm 18, an undergraduate at University of Virginia, pre-law, and found you through HPMOR.

Rationality has been a part of me for almost as long as I can remember, but for various reasons, I'm only recently starting to refine and formalize my views of the world. It is heartening to find others who know the frustration of dealing with people who are unwilling to listen to logic. I've found that it is difficult to become any better at argument and persuasion when you have a reputation as an intelligent person and can convince anyone of anything by merely stating it with a sufficiently straight face.

More than anything else, I hope to become here a person who is a little less wrong than when I came.

Comment author: John_Maxwell_IV 02 August 2012 01:28:46AM 7 points [-]

This "intelligent reputation" discussion is interesting.

I had kind of an odd situation as a kid growing up. I went to a supposedly excellent Silicon Valley area elementary school and was generally one of the smartest 2-4 kids in my class. But I didn't think of myself as being very smart: I brushed off all the praise I got from teachers (because the villains and buffoons in the fiction I read were all arrogant, and I was afraid of becoming arrogant myself). Additionally, my younger brother is a good bit smarter than me, which was obvious even at that age. So I never strongly identified as being "smart".

When I was older I attended a supposedly elite university. At first I thought there was no way I would get in, but once I was accepted and enrolled I was astonished by how stupid and intellectually incurious everyone was. I only found one guy in my entire dorm building who actually seemed to like thinking about science/math/etc. for its own sake. At first I thought that the university admissions department was doing a terrible job, but I gradually came to realize that the world was just way stupider than I thought it was, and assuming that I was anything close to normal was not an accurate model. (Which sounds really arrogant; I'm almost afraid to type that.)

I wonder how else being raised among those who are smarter/stupider than you impacts someone's intellectual development?

Comment author: Petra 02 August 2012 01:45:43AM 1 point [-]

generally one of the smartest 2-4 kids in my class

This is interesting. Do you think your aversion to what you saw as arrogance, but which turned out to be (at least partially) accuracy, might have been overcome earlier if, for example, you'd been the clear leader, rather than having even a small group you could consider intellectual peers? Was that how you saw them?

Comment author: John_Maxwell_IV 02 August 2012 02:05:02AM *  3 points [-]

It's possible. Although for me to have been the "clear leader" you probably would've had to remove a number of people who weren't in the top 2-4 as well. And even then I might have just thought of my family as unusually great, because there'd still be my terrifyingly smart younger brother.

Silicon Valley could be an odd place. I actually grew up in a neighborhood where most of the kids were of Indian descent (we played cricket and a game from India that I just found on Wikipedia called Kabaddi (I can't believe this is played professionally) in addition to standard US games). I didn't think to ask then, but I guess they were mostly children of immigrant software engineers? I haven't really lived anywhere other than the SF bay area yet, so I don't have much to compare it to. Right now I'm thinking I should prepare myself for way more stupidity and racial homogeneity.

Comment author: wedrifid 02 August 2012 04:06:58AM 1 point [-]

Silicon Valley could be an odd place. I actually grew up in a neighborhood where most of the kids were of Indian descent (we played cricket and a game from India that I just found on Wikipedia called Kabaddi (I can't believe this is played professionally) in addition to standard US games).

It took me a few seconds pondering the playing of cricket as 'odd' to realize that I need to identify with the Indians in this story.

Comment author: [deleted] 01 August 2012 12:21:02PM 5 points [-]

I've found that it is difficult to become any better at argument and persuasion when you have a reputation as an intelligent person and can convince anyone of anything by merely stating it with a sufficiently straight face.

Or even without a straight face. Sometimes I've made wild guesses (essentially thinking aloud) and, no matter how many “I think”, “may”, “possibly” etc. I throw in, someone who has heard that I'm a smart guy will take whatever I've said as word of God.

Comment author: Petra 01 August 2012 04:26:58PM 3 points [-]

Yes. My personal favorite was in middle school, when I tried to dispel my assigned and fallacious moniker of "human calculator" by asking someone to pose an arithmetic question and then race me with a calculator. With a classroom full of students as witnesses, I lost by a significant margin, and not only saw no lessening of the usage of said nickname, but in fact heard no repeating of the story outside of that class, that day.

Comment author: DaFranker 01 August 2012 05:54:44PM *  6 points [-]

Beware indeed of giving others more bouncy walls on which evidence can re-bounce and double-, triple-, quadruple-, nay, Npple-count! I once naively thought to improve others' critical thinking by boosting their ability to appraise the quality of my own reasoning.

Lo and behold, for each example I gave of bad reasoning I had made or was making, each of them was inevitably using this as further evidence that I was right, because not only had I been right much more than not (counting hits and arguments are soldiers and all that), but the very fact that I was aware of any mistakes I was making proved that I could not make mistakes, for I would otherwise notice mistakes and thus correct myself.

TL;DR: This remains a profoundly important unsolved problem in large-scale distribution, teaching and implementation of cognitive enhancement and bias-overcoming techniques. It's even stated in Luke's "So you want to save the world" list of open problems as "raising the sanity waterline", a major strategic concern for ensuring maximal confidence of results in this incredibly absurd thing they're working on.

Comment author: Cyan 01 August 2012 07:51:28PM 3 points [-]

Npple

The term in common usage is "n-tuple".

Comment author: DaFranker 01 August 2012 08:09:27PM 3 points [-]

Thanks. I paused for a second when I was about to write it, because I realized that I wasn't quite sure that that was how I should write it, but decided to skip over it as no information seemed lost either way and it had bonus illustrative and comical effect in the likely event that I was using the wrong term.

Comment author: wedrifid 02 August 2012 04:12:37AM *  5 points [-]

but decided to skip over it as no information seemed lost either way and it had bonus illustrative and comical effect in the likely event that I was using the wrong term.

Given all the evidence on 'bouncy' and 'npple-count' I must admit the comic illustration that sprung to mind may not have been the one you intended!

Comment author: [deleted] 01 August 2012 10:22:35PM 2 points [-]

Well... I just started to refuse to make calculations in my mind on demand, and I think I even kind-of freaked out a couple times when people insisted. It worked.

Comment author: TheOtherDave 01 August 2012 05:48:23PM 2 points [-]

I try to keep this sort of thing in mind when interpreting accounts of the implausible brilliance of third parties.

Comment author: TheOtherDave 31 July 2012 09:02:48PM 4 points [-]

it is difficult to become any better at argument and persuasion when you have a reputation as an intelligent person and can convince anyone of anything

Yeah, pretty much.

It is sometimes useful, at that point, to put aside the goal of becoming better at argument and persuasion, and instead pursue for a while the goal of becoming better at distinguishing true assertions from false ones.

Comment author: DaFranker 31 July 2012 07:37:31PM *  3 points [-]

Interestingly, the Authority Card seems subject to the Rule of Separate Magisteria. I'm sure you've also noticed this at some point. Basically, the reputedly-intelligent person will convince anyone of any "fact" by simply saying it convincingly and appearing convinced themselves, but only when it is a fact that is part of the Smart-person Stuff magisterium within the listener's model. As soon as you depart from this magisterium, your statements are mere opinion, and thus everything you say is absolutely worthless, since 1/6 000 000 000 = 0 and there are over six billion other people that have an opinion.

In other words, I agree that it constitutes somewhat of a problem. I found myself struggling with it in the past. Now I'm not struggling with it anymore, even though it hasn't been "solved" yet. It becomes a constant challenge that resets over time and over each new person you meet.

Comment author: Petra 31 July 2012 07:56:12PM 2 points [-]

Of course, as a young person, this obstacle is largely eliminated by the context. Interact with the same group of people for a long period of time, a group through which information spreads quickly, and then develop a reputation for knowing everything. Downside: people are very disappointed when you admit you don't know something. Upside: life is easier. More important downside: you get lazy in your knowledge acquisition.

Comment author: [deleted] 01 August 2012 12:23:53PM 2 points [-]

Downside: people are very disappointed when you admit you don't know something.

This. Sometimes, when I tell people I don't know how to help them with something, they accuse me of being deliberately unhelpful because I'm selfish, angry with them, or something.

Comment author: beoShaffer 31 July 2012 07:37:20PM 3 points [-]

Hi Petra! Minor nitpick: it's rationality, not rationalism. Rationalism is something completely different.

Comment author: Petra 31 July 2012 07:52:01PM 5 points [-]

Pardon me, that falls into the grey area between typo and mistake, where the word in the brain doesn't come out on the page. I will correct it.

Comment author: [deleted] 31 July 2012 10:10:09PM 1 point [-]

Why the hell was that downvoted???

Comment author: DaFranker 31 July 2012 11:12:00PM 2 points [-]

My most reasonable guess:

Because every cause wants to be a cult, and some unwary cultists of LessWrong could very easily fool themselves into thinking that any nitpicking over the use of similar words is misinterpretation of the Holy Sequence Gospel, because the Chapter of Words Used Wrong clearly states that words are meant to communicate and clarify ideas and meanings, and thus it follows that arguing over words instead of arguing over their substance is inherently bad.

Comment author: DaFranker 31 July 2012 11:25:12PM *  3 points [-]

Judging from the immediate downvote, I'll throw in a second guess that I might be doing some cultist preaching myself there.

Comment author: JohnEPaton 30 July 2012 01:44:38AM 7 points [-]

Hello,

My name is John Paton. I'm an Operations Research and Psychology major at Cornell University. I'm very interested in learning about how to improve the quality of my thinking.

Honestly, I think that a lot of my thoughts about how the world works are muddled at the moment. Perhaps this is normal and will never go away, but I want to at least try and decrease it.

At first glance, this community looks awesome! The thinking seems very high quality, and I certainly want to contribute to the discussion here.

I also write at my own blog, optimizethyself.com

See you in the discussion!

-John

Comment author: TomA 28 July 2012 01:45:26AM 4 points [-]

I am a retired engineer with an interest in game theory modeling. This blog site appears to offer a worthwhile trove of information and access to feedback that can be useful. I look forward to participating.

Comment author: Erdrick 26 July 2012 03:46:34AM 6 points [-]

Greetings fellow Ration-istas!

First of all, I'd like to mention how glad I am that this site and community exist. For many years I wondered if there were others like me, who cared about improving themselves and their capacity for reason. And now I know - now I just need to figure out how to drag you all down to sunny San Diego to join me...

My name is Brett, and I'm a 28 year old Computational Biologist in San Diego, California. I've thought of myself as a materialist and an atheist since my freshman year in college, but it wasn't until after I graduated that I truly began to care about rationality. I realized that though I was unhappy with my life, as a scientist I had access to the best tools around for turning that around - science and reason.

I was born with a de novo genomic translocation on my 1st chromosome that left me with a whole raft of medical problems throughout my childhood - funnel chest, cleft palate, mis-fused skull, you name it. As a result I was picked on and isolated for most of my childhood, and generally responded to stress by retreating into video games and SF novels. So I went to school to study genetics and biology, and I graduated from college with a love of science - but also mediocre grades, a crippling EverQuest/World of Warcraft addiction, and few friends.

I suffered alone through a few months of a job that I hated before realizing I could use reason to improve my lot. And life has been one long, slow improvement after another ever since. Now I've got friends, a Master's in an awesome science, and a job that I enjoy... the only thing I was lacking was a community in which to discuss further improvements to myself and my capacity for reason.

Then one of my most rationally minded friends pointed me towards Less Wrong and the Methods of Rationality in May, and here I am.

/b/

P.S. Barring a mass exodus to SD, I've also been considering moving to SF/SJ to be closer to friends and the LW meetups, assuming I could find work there. Does anyone know of any openings for a Bioinformaticist or Computational Biologist in the Bay by chance?

Comment author: candyfromastranger 28 July 2012 02:01:18AM 3 points [-]

A lot of people that I know seem to think that logic and reason are mostly just important in science, but they can do so much to improve everyday life.

Comment author: Kevedes 25 July 2012 05:48:39PM 11 points [-]

Hello Everyone,

This is an interesting site! I found it in the recent New York Observer article about the Singularity.

I've been a huge fan of the Sciences my entire life (primarily Biology, but more recently physics and mathematics) and like to think of myself as a rationalist, although I have doubts about its limits. I'm also a playwright, comedian and musician.

I was loosely raised Greek Orthodox, and although it never really took hold, I think this explains why I really like Nikolai Gogol. I'd consider myself a de-facto atheist with a strong intuitive (faith-based? 'infinite resignation') streak. A few years back I had a 'religious revelation' and it took me quite some time to come to terms with what exactly happened to/in/through me. I now semi-jokingly refer to myself as a Born-Again Secular Humanist.

This seems like an interesting place to meet people and discuss ideas. Thanks for existing!

-Kevin (Kevedes)

Comment author: [deleted] 25 July 2012 06:53:19PM 3 points [-]

I know that after reading this post, one of the first things I thought was that I wanted to read the article you mentioned. So I went and found the article and have linked it below in case anyone else wanted to read it as well.

http://betabeat.com/2012/07/singularity-institute-less-wrong-peter-thiel-eliezer-yudkowsky-ray-kurzweil-harry-potter-methods-of-rationality/

Thanks for referencing it!

Comment author: thomblake 25 July 2012 08:06:46PM 2 points [-]

That is an awesome article - thanks for finding the link!

Comment author: Reiya 25 July 2012 01:39:00AM 4 points [-]

Hello! I found this site due to a series of links that impressed me and tickled my curiosity. It started out with an article an author friend of mine posted on FB about "Incognito Supercomputers and the Singularity". It points out a possible foreshadowing of the advent of avatars as written about in his and his brother's books.

I am female, 55 years old, and tend to let my curiosity guide me.

I call myself a spiritual atheist. It wasn't until I could reconcile my intangible (spiritual?) experiences with my ongoing discovery that religion's definition of god was useless to me that I could use the term atheist and feel like it fit. Ironically, I found myself outgrowing my religious upbringing (Mormon and born-again Christian) when I desired a more honest relationship with god. It took several years of paying attention to what lined up and held together, and noticing what was no longer intellectually tenable, before I first came to the realization that I could no longer call myself a Christian. The change to atheism with Buddhist leanings was not very hard after that.

I have been a massage therapist for almost 20 years now. I also enjoy using the symbolism and synchronicity of astrology for spiritual and psychological points of view. I suspect that many spiritual experiences have to do with right brain functions. I am currently reading FINGERPRINTS OF GOD: What Science is Learning About the Brain and Spiritual Experience, by Barbara Bradley Hagerty.

I honestly don't know much about logic and reason from a scientific or mathematical basis. I hope to change that as I spend time here reading and listening and thinking and changing as needed. I suspect I am right-brain dominant, and I learn in very different ways. Memorization is tricky for me; I learn best by doing and using my hands. It's a good thing I am a massage therapist.

Off the bat, I can say that I am delighted to see people willing to change as they get better data, and I am appreciating the idea of Crocker's rules. It is sometimes impossible to really exchange ideas if one has to stop and mop up the offended feelings of someone who doesn't understand the exchange of information for its own sake.

Thanks for doing this site and I'm looking forward to lurking for a while and then learning more about myself and others.

Comment author: Waterd 24 July 2012 01:54:52AM 1 point [-]

I came to this site in search of truth, or at least to find some people who will help me identify what is real or true and what is not. I think one of my tools for doing that is to debate with other people seeking the same things I am. Not many people are really interested in that, imo, or educated enough to be able to help me as much as I need. Because of this problem a friend of mine directed me to this site, where I should find those people. The huge problem here is how this community decides to trade information. This "Article/comment" format is AWFUL, imo, compared to a forum. I really can't see how I can use this site for my benefit, even if it seems the people who could help me do that should be here. Is there a place LIKE THIS, but with a FORUM instead of this article/comment format? Thanks.

Comment author: fubarobfusco 24 July 2012 06:16:11PM 1 point [-]

What facts — aside from your personal familiarity — about a forum-style site do you think are beneficial?

Comment author: Benedict 22 July 2012 07:52:49PM 16 points [-]

Hey, I'm -name withheld-, going by Benedict, 18 years old in North Carolina. I was introduced to Less Wrong through HPMoR (which is fantastic) and have recently been reading through the Sequences (still wading through the hard science of the Quantum Physics sequence).

I'm here because I have a real problem- dealing with the consequences of coming out as atheist to a Christian family. For about a year leading up to recent events, I had been trying to reconcile Christian belief with the principles of rationalism, with little success. At one point I settled into an unstable equilibrium of "believing in believing in belief" and "betting" on the truth of religious doctrine to cover the perceived small-but-noteworthy probability of its veracity and the proposed consequences thereof. I'd kept this all secret from my family, putting on a long and convincing act.

This recently fell apart in my mind, and I confronted my dad with a shambling confession and expression of confusion and outrage against Christianity. I'm... kinda really friggin' bad at communicating clearly through spoken dialogue, and although I managed to comport myself well enough in the conversation, my dad is unconvinced that the source of my frustrations is a conflicting belief system so much as a struggle with juvenile doubts. This is almost certainly why I haven't yet faced social repercussions, as my dad is convinced he can "fix" my thinking. He's a paid pastor and theologian, and has connections to all the really big names in contemporary theology- having an apostate son would damage both his pride and social status, and as such he's powerfully motivated to attempt to "correct" me.

After I told him about this, he handed me a book (The Reason for God by Timothy Keller) and signed himself up as a counselor for something called The Clash, described as a Christian "worldview conference". Next week, from July 30 to August 3, he's going to take me to this big huge realignment thing, and I'm worried I won't be able to defend myself. I've been reading through the book I mentioned, and found its arguments spectacularly unconvincing- but I'm having trouble articulating why. I haven't had enough experience with rationalism and debate to provide a strong defense, and I fear I'll be pressured into recanting if I fail.

That's why I'm here- in the upcoming week, I need intensive training in the defense of rationality against very specific, weak but troubling religious excuses. I really need to talk to people better trained than me about these specific arguments, so that I can survive the upcoming conference and assert my intellectual independence. Are there people I can be put in touch with, or online meetups where I can talk to people and arm myself? Should I start a discussion post, or what? I'm unfamiliar with the site structure here, so I could use some help.

Oh but dang if there aren't like over a thousand comments here, jeez I don't want to sound like I'm crying for attention but I'm TOTALLY CRYING FOR ATTENTION, srsly I need help you dudes

Comment author: John_Maxwell_IV 02 August 2012 01:41:47AM *  1 point [-]

Hey, I agree with what wedrifid said. I fell into the same trap of trying to beat religious nonsense out of people as a kid. It's a very sexy thing to think about but it doesn't really get you anywhere, in my experience. My only additional advice is that you consider trying to make your "capitulation" to Christianity convincing. For example, don't give in right away, and make up a story for where you went wrong and why you're a Christian again, e.g. "I thought that x, but now I see that y and z, so x is wrong. I guess maybe God exists after all."

Something to keep in mind when arguing with your dad (internally only): your dad is presenting you with evidence and arguments in favor of God's existence, but these amount to a biased sample. If you really want to know the truth, you should spend an equal amount of time hearing arguments from both Christians and atheists, or something like that.

Also, you can check internally if any of his arguments hold up to this test: http://commonsenseatheism.com/?p=8854

Comment author: wedrifid 23 July 2012 12:52:09AM *  22 points [-]

my dad is unconvinced that the source of my frustrations is a conflicting belief system so much as a struggle with juvenile doubts.

That is, roughly speaking, what juvenile doubts are: the "juvenile" mind grappling with conflicts in the relevant socially provided belief system, prior to when it 'clicks' that the cool thing to do is to believe that you have resolved your confusion about the 'deep' issue and label it as a juvenile question that you do not have to think about any more now that you are sophisticated.

Next week, from July 30 to August 3, he's going to take me to this big huge realignment thing,

You clearly do not want to go. His forcing you is a hostile act (albeit one he would consider justified) but you are going along with it. From this, and from your age, I infer that he has economic power over you. That is, you live with him or he is otherwise your primary source of economic resources. I will assume here that your Best Alternative To Negotiated Agreement (BATNA) sucks and you have essentially no acceptable alternative to submission to whatever power plays your father uses against you. Regardless of how the religious thing turns out, developing your potential for independence is something that is going to be worthwhile for you. Being completely in the power of another sucks! Having other options---even if it turns out that you don't take them---raises the BATNA and puts a hard limit on how much bullshit you have to put up with.

Now, the following is what I would do. It may or may not be considered acceptable advice by other lesswrong participants since it abandons some favourite moral ideals. Particularly the ones about 'lying' and 'speaking the truth no matter the cost'.

I haven't had enough experience with rationalism and debate to provide a strong defense

Providing a 'defense' would be a mistake, for the reasons Kawoomba describes. The people you are dealing with are not interested in rational discussion or Aumann agreement and you are not obliged to try yourself. They are there to overwhelm you with social and economic pressure into submitting to the tribe's belief system. Providing resistance just gives them a target to attack.

Honesty and trust are things people earn. These people have not earned your respect and candor. Giving people access to your private and personal beliefs makes you vulnerable and can allow them to use your words to do political and social damage to you, in this case by making everyday life difficult for you and opening you up to constant public shaming. Fortunately that is better than being stoned to death as an apostate, but even so there is no rule of the universe that you must confess or profess beliefs when they will be used against you. It is usually better to keep things to yourself unless there is some specific goal you have that involves being forthright (even if that goal is merely convenience and a preference for openness in cases where the consequences are less dramatic than you face).

Religion is not about literal beliefs about physics. They lie to themselves then lie to you. You can lie too! You understand belief in belief already. You understand that religious belief (and all equivalent tribal beliefs) are about uttering the correct in-group signals. Most people convince themselves that they believe the right thing and then say that thing they 'believe' out loud. Your main difference is that you haven't lied to yourself as successfully. But why should thinking rationally be a disadvantage? Who says that you must self sabotage just because you happened to let your far mode beliefs get all entangled with reality? Sincerity is bullshit. Say what is most beneficial to say and save being honest for people who aren't going to be dicks and use your words against you.

Brainwashing is most effective against those who most strongly resist. While it can take longer to brainwash people who firmly stake their identity on sticking to a contradicting belief, it is those people who resist strongest who are most likely to remain brainwashed. Those that change their mind quickly to make the torture stop (where torture includes shaming and isolation from like-minded people) tend to quickly throw off the forced beliefs soon after the social pressure to comply is removed. (Forget the source; is it in Cialdini?) If you make confessing the faith some sort of big deal that must be fought, then your brain is more likely to rationalise that it must have been properly convinced if it was willing to make such a dramatic confession. The hazing effect is stronger.

Precommit to false confessions. Go into the brainwashing conference with the plan to say all the things that indicate you are a devout Christian who has overcome his doubts. Systematically lying isn't all that much of a big deal to humans, and while it is going to change your beliefs somewhat in the direction of the lies, the effect will be comparatively far, far weaker given that you know you are lying out of contempt and convenience.

Fogging is amazing. Have you ever tried to have a confrontation with someone who isn't resisting? I've tried, even roleplaying with that as the explicit goal and I found it ridiculously difficult. It takes an extremely talented and dedicated persuader to be able to continue to apply active pressure when you are giving them nothing to fight against. Frankly, none of the people you are likely to encounter, including your father, would be able to do that even if they tried. They just aren't that good. You don't want to be barraged with bullshit. Saying the bullshit back to them a couple of times makes the bullshit stop. No brainer.

Are there people I can be put in touch with, or online meetups where I can talk to people and arm myself?

Sure, but I suggest meeting with the likeminded people for your own enjoyment and so you don't develop the unhealthy identity of the lone outsider. That and rationalists know cool stuff and have some useful habits that rub off. Where do you live? Are there lesswrong meetups around?

Comment author: MixedNuts 22 July 2012 11:58:20PM 6 points [-]

Go in panic mode.

This conference is not just making a case that Christianity is correct and debating about it. It's bombarding you with arguments for six days, where you won't hear an argument against Christianity or if you do it'll be awkward rude dissent from people in inferior positions, where you won't be able to leave or have time alone to think, and where you're going against your will in the first place. This is time for not losing your mind, not time for changing it. Don't keep an open mind, don't listen to and discuss arguments, don't change your mind because they're right, don't let the atmosphere influence you. If it helps you can think of it as like being undercover among huge patriots and resisting the temptation to defect (and their ideology may be better than yours), or like being in a psychiatric hospital and watching out for abuse when you know the nurses will try to convince you your reactions are psychiatric symptoms (and they may well be).

So don't see anything at the conference as a social interaction or exchange of ideas. Your goals are to get out of there, to block everything out, to avoid attention, and to watch sharply for anything fishy. Block out the speakers, just watch the audience. If there's a debate be quiet and don't draw attention. If you're asked to speak, voice weak agreement, be vague, or pick peripheral nits. If you're asked to participate in group activities go through the motions as unremarkably as you can. At the socials be a bit distant but mostly your usual self when making small talk, but when someone starts discussing one of the conference topics pretend to listen and agree, smile and nod and say "Yes" and "Go on" and "Oh yeah, I liked that part" a lot. Lie like a rug if you must. Watch the social dynamics and the attitudes of everyone and anything that looks like manipulative behavior. You'll be bored, but don't try to think about any kind of deep topic, even unrelated ones (doing math and physics problems in your head is ok, anything with a social or personal component is not). Try to get enough sleep and to eat well. Enjoy the ice cream. Don't think about anything related to the conference for a couple weeks afterward.

This is only short-term, and it won't help with your father; you probably want to handle that afterwards separately.

Comment author: Bundle_Gerbe 23 July 2012 02:11:57AM *  3 points [-]

It does not sound to me like you need more training in specific Christian arguments to stay sane. You have already figured things out despite being brought up in a situation that massively tilted the scales in favor of Christianity. I doubt there is any chance they could now convince you if they had to fight on a level field. After all, it's not like they've been holding back their best arguments this whole time.

But you are going to be in a situation where they apply intense social pressure and reinforcement towards converting you. On top of that, I'm guessing maintaining your unbelief is very practically inconvenient right now, especially for your relationship with your dad. These conditions are hazardous to rationality, more than any argument they can give. You have to do what MixedNuts says. Just remember you will consider anything they say later, when you have room to think.

I do not think they will convert you. I doubt they will be able to brainwash you in a week when you are determined to resist. Even if they could, you managed to think your way out of Christian indoctrination once already; you can do it again.

If you want to learn more about rationality specific to the question of Christianity, given that you've already read a good amount of material here about rationality in general, you might gain the most from reading atheist sites, which tend to spend a lot of effort specifically on refuting Christianity. Learn more about the Bible from skeptical sources; if you haven't before, you'll be pretty amazed how much of what you've been told is blatantly false and how much about the Bible you don't know (for instance, Genesis 1 & 2 have different creation stories that are quite contradictory, and the gospels' versions of the resurrection are impossible to reconcile. Also, the gospels of Matthew and Luke are largely copied from Mark, and the entire resurrection story is missing from the earliest versions of Mark.) I unfortunately don't know a source that gives a good introduction to Bible scholarship. Maybe someone else can suggest one?

Comment author: OnTheOtherHandle 23 July 2012 03:09:55AM 2 points [-]

I'm not sure how much specific atheist reading you've done, but I found this list to be very helpful at articulating and formalizing all those doubts, arguments and wordless convictions that "this makes no sense." This is also a handy look at what would be truly convincing evidence of the truth of a particular religion's claims. The rest of that author's website is also wonderful.

Comment author: TimS 23 July 2012 12:38:32AM 3 points [-]

Welcome. I'm sorry that you are in such an awkward situation with your family. In terms of dealing with this conference, I can only echo what MixedNuts said (except for the panicking part). I've always found this quote interesting:

Adulthood isn't an award they'll give you for being a good child. You can waste . . . years, trying to get someone to give that respect to you, as though it were a sort of promotion or raise in pay. If only you do enough, if only you are good enough. No. You have to just . . . take it. Give it to yourself, I suppose. Say, I'm sorry you feel like that, and walk away. But that's hard

We have every reason to think that children's beliefs have no momentum - the evidence is right in front of us, they change their minds so often for such terrible reasons. By contrast, the fact that I disagree with another adult is not particularly strong evidence that the other person is wrong.

In other words, try to free yourself from feeling obligated to defend anything or feeling guilty for not engaging with those who wish to change your beliefs. You might consider explicitly saying "Social pressure is not evidence that you are right (or wrong)." If the people talking with you assert that they aren't using social pressure, then ask them to stop continuing the debate. Their willingness to leave is a victory for your emotional state, and their refusal is strong evidence that arriving at true beliefs is not really their goal - but the proper reaction to that stance is to leave the conversation yourself, not to try to win the "you are being rude" debate.

In short, maximizing your positive emotional state doesn't rely on winning debates. Your goal should be to avoid having them at all. (If you hadn't already read the book your father found, I would have suggested declining to do so).

Comment author: Kawoomba 22 July 2012 08:44:57PM 5 points [-]

Hi Benedict!

Bad news first: You will not be able to defend yourself. This is not because you're 18, and it's not because you can't present your arguments in a spectacular fashion.

It is because no one will care about your arguments; they will wait for the first chance to bring some generic counter-argument, probably centering on how they will be there for you in your time of implied juvenile struggle, further belittling you.

And - how aggravating - this is actually done in part to protect you, to protect the relationship with your dad. With the kind of social capital, pride and identity that's on the line for your father, there is no way he could acknowledge you being right - he'd have to admit to himself that he's a phony in his own eyes, and a failure as a parent and pastor in the eyes of his peers.

To him it may be like you telling him he wasted his life on an imaginary construct, while for you it's about him respecting your intellectual reasoning.

Maybe the rational thing to do is not strive for something that's practically unattainable - being respected as an atheist on the basis of your atheist arguments - but instead focus on keeping the relationship with your parent intact while you go do your own thing anyways. Mutual respect of one's choices is great in a family, but it may not be a realistic goal given your situation, at least in respect to discussing god.

Good news: While this is such a defining issue for your father, is it a defining issue for you to publicly tell your father your new stance? How hard/easy would it be to let him continue with his shtick, retain the relationship, and still live your life as an open atheist for all intents and purposes - other than when with your family, where you can always act with mild disinterest?

Rational in this forum is mostly construed as "the stuff that works in optimising your terminal values". It is possible for you to be the "bigger man" here, depending on which of the above you value higher. But make no mistake - I doubt that you'll change anyone's opinion on god regardless.

Comment author: Grognor 23 July 2012 01:41:08AM *  2 points [-]

Hello, friend, and welcome to Less Wrong.

I do think you should start a discussion post, as this seems clearly important to you.

My advice to you at the moment is to brush up on Less Wrong's own atheism sequence. If you find that insufficient, then I suggest reading some of Paul Almond's (and I quote):

great atheology

If you find that insufficient, then it is time for the big guy, Richard Dawkins:

If you are somehow still unsatisfied after all this, lukeprog's new website should direct you to some other resources, of which the internet has plenty, I assure you.

Edit: It seems I interpreted "defend myself" differently from all the other responders. I was thinking you would just say nothing and inwardly remember the well-reasoned arguments for atheism, but that's what I would do, not what a normal person would do. I hope this comment wasn't useless anyway.

Comment author: Vaniver 22 July 2012 09:10:25PM 3 points [-]

Hey! I've got a pastor father too, but thankfully my atheism doesn't seem to be a big deal for him. (It helps that I don't live nearby.)

I think the "conflicting belief system" is, as I understand it, the right model. There's a Christian worldview, which has some basic assumptions (God exists, the Bible is a useful source for learning about God, etc.), and there's a reductionist worldview, which has some basic assumptions (everything can be reduced to smaller parts, experiments are a useful source for learning about reality, etc.), and the picture you can build out of the reductionist worldview matches the world better than the picture you can build out of the Christian worldview. (There are, of course, other possible worldviews.)

I would not put much hope into being able to convince the people at this event that they should be atheists; I wouldn't even hope to convince them that you should be an atheist. And so the question becomes what your goals are.

If you're concerned about recanting your atheism and meaning it, the main thing I can think of that might be helpful is the how to change your mind sequence. You can keep that model in mind and compare the experience you're undergoing to it- it's unlikely that they'll be using rational means of persuasion, and you can point out the difference.

Are there people I can be put in touch with, or online meetups where I can talk to people and arm myself? Should I start a discussion post, or what? I'm unfamiliar with the site structure here, so I could use some help.

Starting a post in discussion is an alright idea; it'll work well if you mention specific arguments that you want to have responses to.

Comment author: aaronde 21 July 2012 05:41:48AM 1 point [-]

Hello, everyone.

Recent college grad here from the Madison area. I've been aware of this site for years, but started taking it seriously when I stumbled upon it a few months ago, researching evidential (vs causal) decision theory. I realized that this community seriously discusses the stuff I care about - that really abstract, high-minded stuff about truth, reality, and decisions. I'm a math person, so I'm more interested in the theoretical, algorithmic side of this. I've been a rationalist since, at 15, I realized my religion was bunk, and decided I needed to know what else I was wrong about.

Comment author: Gaviteros 19 July 2012 06:40:52AM 1 point [-]

Hello, Less Wrong!

My name is Ryan and I am a 22 year old technical artist in the Video Game industry. I recently graduated with honors from the Visual Effects program at Savannah College of Art and Design. For those who don't know much about the industry I am in, my skill set is somewhere between a software programmer, a 3D artist, and a video editor. I write code to create tools to speed up workflows for the 3D things I or others need to do to make a game, or cinematic.

Now, I found lesswrong.com through the Harry Potter and the Methods of Rationality podcast. Up until that point I had never heard of Rationalism as a current state of being... so far I greatly resonate with the goals and lessons that have come up in the podcast, and what I have seen about rationalism. I am excited to learn more.

I wouldn't go so far as to claim the label for myself as of yet, as I don't know enough and I don't particularly like labels for the most part. I also know that I have several biases; I feel like I know the reasons and causes for most, but I have not removed them from my determinative process.

Furthermore I am not an atheist, nor am I a theist. I have chosen to let others figure out and solve the questions of sentient creators through science, and I am no more qualified to disprove a religious belief than I would be to perform surgery... on anything. I just try to leave religion out of most of my determinations.

Anyway! I'm looking forward to reading and discussing more with all of you!

Current soapbox: the educational system's de-emphasis of critical thinking skills.

If you are interested you can check out my artwork and tools at www.ryandowlingsoka.com

Comment author: HBDfan 15 July 2012 11:43:08AM 3 points [-]

Hi! I have read for a while. I read HPMOR and enjoyed the sequences. I prefer not to say where I live.

Comment author: TGM 12 July 2012 12:30:56AM 3 points [-]

There appear to be two "Welcome to Less Wrong!" blog posts. I initially posted this in the other, older one:

I’m 20, male and a maths undergrad at Cambridge University. I was linked to LW a little over a year ago, and despite having initial misgivings for philosophy-type stuff on the internet (and off, for that matter), I hung around long enough to realise that LW was actually different from most of what I had read. In particular, I found a mix of ideas that I’ve always thought (and been alone amongst my peers in doing so), such as making beliefs pay rent; and new ones that were compelling, such as the conservation of expected evidence post.

I’ve always identified as a rationalist, and was fortunate enough to be raised to a sound understanding of what might be considered ‘traditional’ rationality. I’ve changed the way I think since starting to read LW, and have dropped some of the unhelpful attitudes that were promoted by status-warfare at a high achieving all-boys school (you must always be right, you must always have an answer, you must never back down…)

I’m here because the LW community seems to have lots of straight-thinking people with a vast cumulative knowledge. I want to be a part of and learn from that kind of community, for no better reason than I think I would enjoy life more for it.

Comment author: palguay 10 July 2012 05:57:44AM 2 points [-]

Hi Everyone, I stumbled upon this website while reading a comment on reddit. I am a programmer living in India; I came back to India in March after living in the US for 6 years.

I am interested in cognitive psychology and have started working on a pet project of mine to implement the various cognitive tasks available on commercial websites on my own website, http://brainturk.com.

I hope to contribute to some discussions and learn from others here.

Comment author: Adriano_Mannino 04 July 2012 01:23:15AM *  11 points [-]

Hi all, I'm a lurker of about two years and have been wanting to contribute here and there - so here I am. I specialize in ethics and have further interests in epistemology and the philosophy of mind.

LessWrong is (by far) the best web resource on step-by-step rationality. I've been referring all aspiring rationalists to this blog as well as all the people who urgently need some rationality training (and who aren't totally lost). So thanks, you're doing an awesome job with this rationality dojo!

Comment author: TheEleaticStranger 03 July 2012 01:48:17AM 5 points [-]

Hi, I am interested in the neurobiology of decision-making and rationality and happened to stumble upon this site and decided to join.

-Cheers.

Comment author: WingedViper 01 July 2012 09:15:28AM 6 points [-]

Hi,

I'm a German student-to-be (I am going to start studying IT in October) and I am interested in almost anything connected with rationality, especially the self improvement, biases and "how to save the world" parts. I hope that lesswrong will be (and it already has been to a certain amount) one of the resources for (re-)shaping my thinking and acting towards a better me and a better world.

I came here, like so many others ;-), because I wanted to check out the foundations/concepts behind HPMOR and I could not just leave again. So over the last few months I visited again and again to read some of the sequences and posts.

As I am interested in science, especially physics, maths, technology and astronomy, I have a question that I would like to ask the lesswrong community: What is a fast and reliable way of determining the trustworthiness of scientists and scientific papers? I ask this because there is a lot of pseudoscience and poorly done science out there which often isn't easy to distinguish from unconventional/disruptive science (at least not for me).

all the best Viper

Comment author: monkeywicked 25 June 2012 09:23:25PM 5 points [-]

Hi.

I'm a fiction writer and while I strive towards rationalism in my daily life, I can also appreciate many non-rational things: nonsensical mythologies, perverse human behaviors, and the many dramas and tragedies of people behaving irrationally. My criteria for value often relate to how complex and stimulating I find something... not necessarily how accurate or true it may be. I can take pleasure in ridiculous pseudo-science almost as much as actual science, enjoy a pop-science theory as much as deep epistemology, and I can find a hopelessly misguided person to be more compelling and sympathetic than a great rationalist.

However, conveniently, it often turns out that the most interesting stories, the most mind-bending concepts, and the most impressive acts of creativity are born of rationalist thinking rather than pure whimsy. And so I can have my cake and eat it too, because the posts at LW are just as likely to create the sensation of mental expansiveness that I associate with great fiction (or, I suspect, compelling theology), while also attempting to be, uh, you know, less wrong.

So it's fun to be here. And if it helps me think and experience the world more clearly and critically... that's gravy.

Recently I've been working on several sci-fi writing projects that involve topics that are discussed at LW. One is about the development of AI and one about the many-worlds interpretation. Neither project is 100% "hard sci-fi"; however, I would ideally like them to be not totally stupid... since I think plausibility and accuracy often produce narrative interest--even if plausibility and accuracy are not, in and of themselves, objectives. After doing a lot of research on the topics, I still have many questions. It seems to me that the LW community might be the best place to get clear, smart, informed answers in layman's terms.

I'll fire away with a couple questions and see what happens. If this works out, I'll probably have a lot more...

(I wasn't sure if these ought to be comments at "And the Winner Is... Many-Worlds!" If so, I can re-post there.)

  1. In the MWI it's often suggested that anything that could have happened will have happened. Thus, quantum immortality, etc. But this often puzzles me. Just because there are infinite worlds, why should there be infinite diversity of worlds? You could easily create infinite worlds by simply moving a single atom around to an infinite number of locations... but those worlds would be essentially identical. If Everett's chance of surviving each year is 99% (100% minus 1%) for every year he lives, then wouldn't that mean his chance of being dead at 100 is 100%? Wouldn't that mean he's dead in all worlds? (See the quick arithmetic check after this list.) If you send an infinite number of photons through the double slit, their infinite possible locations on the wall are extremely limited. Couldn't the many worlds of the MWI resemble infinite photons being sent through a double-slit experiment? Infinite in number, but extremely constrained in result?

  2. Is it possible, within the MWI, to have a situation where all but one world experiences some event? E.g. event X happens at time 2 in world 2, time 3 in world 3 and so on so that X appears at some time in every world except world 1. Now say that X is a Vacuum Decay event... wouldn't that mean it is possible to only have ONE viable, interesting world even within the MWI?

  3. David Deutsch, in The Fabric of Reality, claims that a quantum computer running Shor's Algorithm would be borrowing computational power from parallel worlds since there isn't enough computational power in all of our universe to run Shor's Algorithm. Does anyone know what would be happening in the worlds that the computer is borrowing the computational power from? Would those worlds also have to have identical computers running Shor's Algorithm? Or is there some more mysterious way in which a quantum computer can borrow computational power from other worlds?

  4. Is there any hypothetical, theoretical, or even vaguely plausible way for an intelligent being in one world to gain information about the other worlds in the MWI? Interference takes place constantly between particles in our world and other worlds; is there any way for this interference to be turned into communication or at least advanced speculation about the other worlds? Or is such a notion pure fantasy?
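
A quick back-of-the-envelope check of the arithmetic in question 1, assuming the intended annual survival chance is 99% (the numbers are purely for illustration):

```python
# Classical, single-world version of question 1: a 99% chance of surviving each year.
# The 1%-per-year death risks don't simply add up to 100% over 100 years.
p_survive_year = 0.99

p_alive_after_100 = p_survive_year ** 100   # roughly 0.366
p_dead_by_100 = 1 - p_alive_after_100       # roughly 0.634, not 1.0

print(f"P(alive after 100 years) = {p_alive_after_100:.3f}")
print(f"P(dead by then)          = {p_dead_by_100:.3f}")

# However many years you multiply in, the survival probability shrinks toward zero
# but never reaches it, which is what the quantum-immortality argument leans on.
```

So even the naive classical arithmetic doesn't reach 100%; part of what I'm trying to pin down is whether MWI changes that picture at all.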

Thanks in advance! If anyone can answer any of these or redirect me to resources inside/outside of LW, I'd be grateful.

Cheers,

MW

Comment author: pragmatist 25 June 2012 10:04:07PM *  3 points [-]

Welcome to LessWrong! Here are some answers to your questions about MWI:

  1. The space of possibilities in MWI is given by the configuration space of all the particles in the universe. The configuration space consists of every possible arrangement of those particles in physical space. So if a situation can be realized by rearranging the particles, then it is possible according to MWI. There is a slight caveat here, though. Strictly speaking, the only possibilities that are realized correspond to points in configuration space that are, at some point in time, assigned non-zero wavefunction amplitude. There is no requirement that, for an arbitrary initial condition and a finite period of time, every point in configuration space must have non-zero amplitude at some point during that period. Anyway, thinking in terms of worlds is actually a recipe for confusion when it comes to MWI, although at some level it may be unavoidable. The important thing to realize is that in MWI "worlds" aren't fundamental entities. The fundamental object is the wavefunction, and "worlds" are imprecise emergent patterns. Think of "worlds" in MWI the same way you think of "blobs" when you spill some ink. How much ink does there need to be in a particular region before you'd say there's a blob there? How do you count the number of blobs? These are all vague questions. (There is a schematic sketch of this after the list.)

  2. MWI does not play nicely with quantum field theory. The whole notion of a false vacuum tunneling into a true vacuum (which, I presume, is what you mean by vacuum decay) only makes sense in the context of QFT. The configuration space of MWI is constructed by considering all the arrangements of a fixed number of particles. So particle number is constant across all worlds and all times in configuration space. Unlike in QFT, particles can't be created or destroyed. So the configuration space of a zero-particle world would be trivial, a single point. If you have more than one particle then all the worlds would have to have more than one particle. None of them would be non-viable or uninteresting. Perhaps it is possible to construct a version of MWI that is compatible with QFT, but I haven't seen such a construction yet.

  3. Deutsch's version of MWI (at least at the time he wrote that book) is different from the form of MWI advocated in the sequences. According to the latter, "world-splitting" is just decoherence, the interaction of a quantum system with its environment. But a quantum computer will not work if it decoheres. So according to this version of MWI, in order for a quantum computer to work, we need to make sure it doesn't split into different worlds. Instead, we would have a quantum computer in a superposed state within a single world, which I guess you can think of as many overlapping and interfering computers embedded in a single larger world. So you're not really harnessing the computational power of other worlds.

  4. On an appropriate conception of "worlds", interference does not take place between particles in our world and other worlds. Interference effects are an indication of superposition in our world, a sign of a quantum system that hasn't decohered. Decoherence destroys interference. It is possible for there to be interference between full-fledged worlds (separate branches of a wave function large enough to contain human beings), but it is astronomically unlikely. You can communicate with other worlds trivially, as long as those worlds are ones which will split off from your world in the future. But otherwise, you're out of luck.
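
To make answer 1 slightly more concrete, here is the schematic shape of the object MWI treats as fundamental (a sketch for N non-relativistic particles only; as answer 2 notes, the relativistic story is more complicated):

```latex
% Schematic only: N non-relativistic particles, each with a position x_i in R^3.
% A single point of configuration space is an entire arrangement (x_1, ..., x_N).
\[
  \Psi:\ \mathbb{R}^{3N} \times \mathbb{R} \longrightarrow \mathbb{C},
  \qquad \Psi(x_1, \dots, x_N, t)
\]
% A "world" is not an extra ingredient of the theory; it is an informal label for a
% region of configuration space where |\Psi|^2 is concentrated and has decohered
% from the rest: the ink-blob picture above.
```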

Comment author: Zack_M_Davis 25 June 2012 09:40:19PM *  1 point [-]

However, conveniently, it often turns out that the most interesting stories, the most mind-bending concepts, and the most impressive acts of creativity are born of rationalist thinking rather than pure whimsy.

Yes; I like Steven Kaas's explanation:

Truth is more interesting than fiction because it's connected to a larger body of canon.

Comment author: Evercy 24 June 2012 01:06:37AM 3 points [-]

Hello!

I am a university student studying biology in Ontario. I actually knew about lesswrong for a few years before I joined. My good friend likes to share interesting things that he finds on the internet, and he has linked me to this site more than once. Over time, lesswrong has grown increasingly relevant to my interests. Right now, I'm mainly reading posts and dabbling in the sequences. But I hope that I will be able to contribute some ideas in posts or comments once I get used to how things work around here. Some things that interest me are rhetoric, anthropology, software engineering, cloning and transhumanism. Oh, and biology of course, since that is my field of study (but something about NEEDING to study it, instead of voluntarily doing so, diminishes my enthusiasm for it haha). I hope I'll get to know you all better!

Comment author: Jost 23 June 2012 03:13:17PM 5 points [-]

Hey everyone,

I'm Jost, 19 years old, and studying physics in Munich, Germany. I came across HPMoR in mid-2010 and am currently translating it into German. That way, I found LW and dropped by from time to time to read some stuff – mostly from the Sequences, but rarely in sequence. I started reading more of LW this spring, while a friend and I were preparing a two-day introductory course on cognitive biases entitled “How to Change Your Mind”. (Guess where that idea came from!)

I'm probably going to be most active in the HPMoR-related threads.

I was very intrigued by the Singularity- and FAI-related ideas, but I still feel a kind of future shock after reading about all these SL4 ideas while I was at SL1. Are there any remedies?

Comment author: [deleted] 20 June 2012 08:32:07AM 4 points [-]

Hey LW community. I'm an aspiring rationalist from the Bay Area, in CA, 15 years old.

I found out about this site from Harry Potter and the Methods of Rationality, and after reading some of the discussions, I decided to become a member of the community.

I have never really been religious at any time of my life. I dismissed the idea of any kind of god as fiction around the same time you would find out that Santa isn't real. My family has never been very religious at all, and I didn't even find out they were agnostic until recently. That said, I would consider myself an atheist, because I don't have any doubts that there is no god.

I look forward to being a part of this community, and learning more about rationalism.

Comment author: GESBoulder 14 June 2012 12:31:06AM *  5 points [-]

Hello to the LW Community. My name is Glenn, 49, from Boulder, Colorado. After completing my Master's degree in Economics, I began a career in investment management, with a diversion into elected politics (a city council, a regional council of governments, then the Colorado state legislature, along with corporate and non-profit boards). My academic work focused on decision theory and risk analysis and my vocation on their practical application. Presently, I manage several billion dollars' worth of fixed-income portfolios on behalf of local governments and non-profits across the United States. I've also worked with the U.S. government doing training for centrist, pro-democracy parties in the emerging world.

My path to you was through a YouTube interview of Steve Omohundro. My path to him was general background research on AI, space exploration, energy, computer science and nanotech in my sometimes seemingly vain attempt to keep pace with the accelerating change in the world.

What is left of my religion sits well outside of orthodoxy; call it Christian humanism. I started off halfway gone as a Presbyterian, and then subjected my beliefs to astrophysics (my original undergrad major), evolution, Jung, critical analysis of the Bible, skepticism, Lucifer (as in the light-bearing meme of the Enlightenment and the American Revolution), objectivism, experience, and rationalism. I remain very skeptical of the genius of anyone or any group to plan or scheme or act as a virtuous vanguard. I believe that power is best diffused.

I bring to the table experience and knowledge of economics, finance, politics and public policy formation. I'll do a lot of deferring on other subjects. I think the work here on rationalism and at SI is of critical importance. You all have my highest regard. I too look forward to your influences on me becoming less wrong.

Comment author: Vaniver 14 June 2012 12:56:27AM 1 point [-]

Welcome! We're glad to have you.

Comment author: Blackened 12 June 2012 12:59:38AM 3 points [-]

Hello, LessWrong. I'm 20 years old, originally from Bulgaria, living and studying Software Engineering in London (just finished my 1st year). I have always wanted to know a lot about human thinking, because of my need to be as optimal as possible, my interest in technical things, and my tendency to seek rigorous explanations. I still have a deep interest in psychology, and I see some potentially very powerful applications that I would feel inefficient without. The second thing I love is programming.

As a rationalist, I'm very strict with myself. I always go for the expected outcome, which usually brings me to sacrifice whatever brings short-term pleasure and happiness in favor of self-improvement (my time-management is also built with this in mind). Despite that, I'm usually quite happy in life. My ideal way of spending my day is reading and studying and practicing programming, maybe exercising. Unfortunately, I can't even spend half of my day so efficiently because of procrastination (btw I'm writing this in the efficient parts of today :D), but I'm gradually overcoming it, and I'm putting a lot of effort into battling it. While still battling it, I can use LessWrong a lot, as it's productive and fun - hopefully it'll replace less efficient activities.

Google brought me here - I was reading Heuristics and Biases: The Psychology of Intuitive Judgment (2002) and I sought additional information on a certain thingy. That was when I saw this community, and my heart started beating fast - I already had my own idea of rationalism and I knew a few people who follow it and own it as much as I do (they are also my closest friends). Eventually, I found my idea to be more extremely rational than this community's idea. I enjoyed Yudkowsky's Harry Potter a lot and I'm quite similar to Harry Potter, although there are many cases where I consider his actions to be irrational (I'm quite convinced that the author is aware of those, as some of them can even be explained by simple biases) - despite this, I'm very much looking forward to the latest chapters.

I am currently looking forward to meeting any rationalists (online), as I'm looking for an exchange of information and I always have tons of questions. Rationalists are likely to have many of the answers I'm seeking, some of which are hard to get elsewhere. I have useful information to share as well.

I will also post in "tell your rationalist story".

Comment author: Lukas_Gloor 10 June 2012 10:30:20PM *  2 points [-]

Hi! I discovered LW about a year ago and now I have actually created an account. I study philosophy, with biology as a minor. Sometimes I'm rather shocked by the things my fellow students believe and how they argue for their beliefs; I wish something like LW were part of the standard curriculum. My main interests are ethics, philosophy of mind and evolutionary biology, and I'm looking forward to participating in discussions on these issues. Especially on ethics, as I'm skeptical regarding some of the views advocated on here (I'm a utilitarian). As someone who had read the original books several times, I was also delighted to find out about HPMoR recently.

Comment author: Nighteyes5678 08 June 2012 08:03:57PM 4 points [-]

Hey all. I figured that after a few long months of lurking, I might as well introduce myself (that way when I post elsewhere, someone doesn't feel obligated to smack my nose politely with a rolled-up newspaper and send me here), even though I can never figure out what to say.

I've now finished all the Sequences and I've successfully resisted the urge to argue with comments that are years old, and I think I've learned a lot. One of the high moments was that I had just finished reading the Zombie sequence when I met a friend of a friend, who started to postulate the Zombie world and concept. Thanks to my reading here, I'd already done some thinking about the matter and could engage with him intelligently. How awesome is that?

One of my biggest struggles is figuring out how some of the stuff on Lesswrong is applicable to normal life. I'm not an AI researcher, I get confused by computers, and I'm a fairly normal person. I'm into the outdoors, writing (dream job, right there), teaching, history, and board games. A lot of times, then, I wish the Sequences had parts after each post that suggested ways that the principles impact normal life. Trying to figure out how to connect the Bayes way to more normal decisions is challenging. Perhaps this has already been addressed - Lesswrong is also a labyrinth for newbies. ^_^

As far as posting goes, I'm still finding the right line between being investigative and being defensive/aggressive. Generally, I'm impossible to offend and I don't take things personally. I'll try and live that creed as well as just say it, but now it's on record. I also believe strongly in giving someone the benefit of the doubt, or taking their statement in the best possible light.

I'm not sure what else to say, but if there's one thing I've learned here, it's that people are always happy to point out areas that are lacking in both information and depth. Hope to see y'all around and I'm looking forward to exploring various things with awesome folk.

Comment author: Paul_G 08 June 2012 02:19:27AM 5 points [-]

Hi! My name is Paul, and I've been an aspiring rationalist for years. A long time ago, I realized implicitly that reality exists, and that there is only one. I think "rationality" is the only reasonable next thing to do. I pretty much started "training" on TvTropes, reading fallacies and the like there, as well as seeing ways to analyze things in fiction. The rules there apply to real life fairly well.

From there, I discovered Harry Potter and the Methods of Rationality, and from there, this site. Been reading quite a bit on and off over the past little while, and decided to become a bit more active.

Just visited a meetup group in Ottawa (which is about a 2 hour drive), and I no longer feel like the only sane man in the world. Meeting a group of Bayesian rationalists was incredibly enlightening. I still have a lot to learn.

Comment author: [deleted] 07 June 2012 07:58:29PM 4 points [-]

Hey guys. My name is Michael and I'm a business student living in Little Rock, Arkansas. I've recently become fascinated by the work of SI and I'm interested in participating in any way I can. I've considered myself a rationalist ever since I abandoned religion in my teens. However, lately I realized I need to interact with other rationalists in order to further my development. I'm considering trying to attract more LessWrong members from where I live. If anybody has any advice concerning that I'd be happy to hear it.

Comment author: steven0461 07 June 2012 08:13:53PM 1 point [-]

Welcome to LessWrong! It sounds like you may want to organize a meetup in your town if there isn't one already.

Comment author: [deleted] 07 June 2012 08:37:54PM 1 point [-]

Thank you! Yes I've read about those. Unfortunately there are none in Arkansas. I've been thinking about advertising around campus.

Comment author: prashantsohani 01 June 2012 05:20:49PM 5 points [-]

Hello, everyone! I'm 21, soon to graduate from IIT Bombay, India. I guess the first time I knowingly encountered rationality, was at 12, when I discovered the axiomatic development of Euclidean geometry, as opposed to the typical school-progression of teaching mathematics. This initial interest in problem-solving through logic was fueled further, through my later (and ongoing) association with the Mathematics Olympiads and related activities.

Of late, I find my thoughts turning ever more to understanding the workings and inefficiencies of our macro-economy, and how it connects with basic human thought and behavior. I very recently came to know of Red Plenty, which seems generally in line with the evolutionary alternative described in the foreword to Bucky Fuller's Grunch of Giants... and that is what made me feel the need to come here, actively study and discuss these and related ideas with a larger community.

Having just started with the Core Sequences, looking forward to an enriching experience here!

Comment author: Stuart_Armstrong 01 June 2012 05:38:49PM 1 point [-]

Well welcome, and hope you find yourself happy and interested here!

Comment author: Lykos 30 May 2012 08:00:47PM 8 points [-]

Hello, everyone. I'm Lykos, and it's a pleasure to finally be posting here. I'm a high school junior and I pretty much discovered the concept of rationality through HP:MoR. I'm not sure where I discovered THAT. I'm an aspiring author, and am always eager to learn more, and rationality, I've found, has helped me with my ideas, both for stories and in general. I've currently read the Map and Territory sequence, and am going through Mysterious Answers to Mysterious Questions. I doubt I'll be posting much- I'll probably be spending most of my time basking in the intelligence of the rest of you.

Either way, it is a pleasure to join the community. Thank you.

Comment author: Worthstream 30 May 2012 03:24:21PM *  2 points [-]

Hi, Worthstream here. I'm from Italy, as you will no doubt notice from my unusual choice of words. (Europeans tend to overuse Latin-derived words, in my experience.)

I graduated in computer science and am currently working as a web programmer, the kind of technical background I think is quite common here, judging by the number of useful applets and websites built by community members (Beeminder, just to name the first that comes to mind).

I'm a regional coordinator of the Italian Mensa, a society I joined thinking that I would find a lot of rational people. That assumption has been proved false: Mensa members are not appreciably more rational than the rest of the population.

While I usually like neither fanfiction nor Harry Potter, HP:MoR is one of the best books I've read. I'm actively trying to get my friends to read it.

If I remember correctly, I found LW by looking for akrasia and time management advice, since I'm really interested in self-improvement. I remember reading some articles I found interesting, started following the links to other posts, and the links in those posts too... and suddenly I had an enormous backlog of articles to read!

Comment author: TheOtherDave 30 May 2012 04:52:41PM *  3 points [-]

found LW by looking for akrasia and time management advice [...] and suddenly I had an enormous backlog of articles to read!

* raises finger *
* opens mouth *
* closes mouth *
* lowers finger *

Hi, Worthstream. Welcome to LW!
Yeah, CS backgrounds are pretty common here, as is being disappointed by Mensa, liking HP:MoR, and an ongoing struggle with managing the shiny distractions of the Internet.

Comment author: witzvo 27 May 2012 12:17:50AM 4 points [-]

You can call me Witzvo. My determination of whether I'm a "rationalist" is waiting on data to be supplied by your responses. I found HPMOR hilarious and insightful (I was hooked from the first chapter which so beautifully juxtaposed a rationalist child with all-too-realistic adults), and lurked some for a while. I have one previous post which I doubt ever got read. To be critical, my general impression of the discussions here is that they are self-congratulatory, smarter than they are wise, and sometimes obsessed with philosophically meaningful but not terribly urgent debates. However, I do value the criteria by which karma is obtained. And I saw some evidence of responses being actually based on the merits of an argument presented, which is commendable. Also, Eliezer should be commended for sticking his neck out so far and so often.

I was born into a sect of Christianity that is heretical in various ways, but notably in that they believe that God is operating all for the (eventual) good of mankind, and that we will all be saved (e.g. no eternal Hell). I remain agnostic. Talk about non-falsifiability and Occam's razor all you like, but a Bayesian doesn't abandon the possibilities to which he assigns prior mass without evidence, and even then the posterior mass generally just drops towards 0, not all the way. Still, my life is basically secular; I don't think there's an important observable difference in how I live my life from how an atheist lives, and that's pretty much the end of the matter for me. Oh, perhaps I have times of weakness, but who doesn't?

I have formal training in statistics. I am very sympathetic to the Savage / de Finetti schools of subjective Bayesianism, but if I had to name my philosophy of science I'd call it Boxian, after George Box (cf. http://www.jstor.org/stable/2982063; I highly recommend this paper AND the discussion. Sorry about the paywall).

I find the Solomonoff/Kolmogorov/AIXI ideas fascinating and inspiring. I aspire to compute for example, (a computationally bounded approximation to) the normal forms of (a finite subsequence of) a countable sequence of de Bruijn lambda terms and go from there. I do not see any lurking existential crisis in doing so.

In fact, maybe I've missed something, but I have not yet identified an actionable issue regarding one of the much-discussed existential crises. I do not participate much in the political system of my country, or even see how that would help, except through actual rational discussion and other action.

I find far more profit in exploring ideas, such as say, Inventing on Principle (http://vimeo.com/36579366), or Incremental Learning in Inductive Programming (http://www.cogsys.wiai.uni-bamberg.de/aaip09/aaip09_submissions/incremental.pdf), either of which I would be happy to discuss.

I am also intellectually lonely.

That's probably more than enough. Go on and tell me something less wrong.

Comment author: [deleted] 29 May 2012 06:47:16AM *  2 points [-]

Well, the standard response to the whole 'agnostic' debate is that while probability is subjective, probability theory is theorems: You and I are only ever allowed to assign credence according to the amount of evidence available, and the God hypothesis has little, so we believe little. This gives me the mathematical right to make the prediction "the Judeo-Christian God does not exist" and expect to see evidence accordingly. We say ~God because that is what we expect.

Other than that, welcome to Less Wrong. If you have time to read a book draft significantly longer than The Lord of the Rings trilogy, written in blog posts, I recommend reading the sequences in chronological order (use the article navigation at the bottom).

Comment author: CWG 29 May 2012 05:49:45AM *  1 point [-]

Carl Sagan described himself as agnostic, and it's a rational position to hold. As Sagan said:

"An atheist is someone who is certain that God does not exist, someone who has compelling evidence against the existence of God. I know of no such compelling evidence. Because God can be relegated to remote times and places and to ultimate causes, we would have to know a great deal more about the universe than we do now to be sure that no such God exists. To be certain of the existence of God and to be certain of the nonexistence of God seem to me to be the confident extremes in a subject so riddled with doubt and uncertainty as to inspire very little confidence indeed".

However, I personally attach zero likelihood to anything like the Christian, Muslim, Jewish or Hindu god or gods existing. Technically I might be an agnostic, but I think "atheist" represents my outlook and belief system better. Then again, "a-theism" is defined in terms of what it doesn't believe. I prefer to minimize talking about atheism, and talk about what I do believe in - science, rationality and a naturalistic worldview.

Comment author: [deleted] 29 May 2012 02:58:32PM *  5 points [-]

0 and 1 are not probabilities anyway, so refusing to call someone an atheist (or a theist) because they assign a non-zero (or ‘non-one’) probability to a god existing seems pointless to me, because then hardly anyone would count as an atheist (or a theist). (It's also a fallacy of gray, because assigning 0.1% probability to a god existing is not the same as assigning 99.9% probability to that.)
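
A schematic way to see the point, in odds form (just restating the standard argument, nothing new):

```latex
% Bayes' theorem in odds form: posterior odds = prior odds times the likelihood ratio.
\[
  \frac{P(H \mid E)}{P(\lnot H \mid E)}
  = \frac{P(H)}{P(\lnot H)} \cdot \frac{P(E \mid H)}{P(E \mid \lnot H)}
\]
% Probabilities of 0 and 1 correspond to odds of 0 and infinity, and no finite
% likelihood ratio can move you onto or off those points; so "atheist = assigns
% probability exactly 0 to a god existing" would describe almost nobody.
```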

Comment author: blackhole 25 May 2012 10:00:38PM 4 points [-]

Hello everyone. I’ve joined this site because I have a goal of being a very rational person. Intelligence and logic are very important to me. Actually I have spent many years seeking truth and reality. Probably the same as everyone else spending time here. I’m not here to prove anything but rather to learn and have my own ideas tested and checked. I’m hoping to remember the rules and etiquette so that I don’t come across the wrong way (very easy to do when not face to face) or waste anyone’s time. I’m a family man who is concerned about my children’s future because of the swift pace of technological change and its resultant social effects. For example, the smartphone phenomenon and the increased socialization it allows. Entranced texters on an unrelenting, zombie-like invasion make one ask: what the hell is going on here? To me, it’s an emotional issue that is detracting from intellectual growth and the evolution of intelligence. Is it the fall of Rome? Can a few brilliant minds come up with the tools (A.I.?) that will turn the masses from heading blindly down a path leading to destruction? Help required!

Comment author: CWG 25 May 2012 09:23:20AM *  6 points [-]

Greetings! I joined a little while ago under my usual username, the one I use everywhere on the web. Then I realized - this is very public, and I'd rather not worry about potential clients or employers drawing conclusions from what I write about my akrasia, poor planning, depression or anything like that. So here's the version of me that's slightly less connected to my real life identity.

Very briefly:

  • I feel pretty much at home here.
  • Rationality is awesome.
  • HP:MOR is not only awesome, it's also my favorite Harry Potter book by a long way.
  • Rationality has not always helped me in having happy relationships. But sometimes it has.
  • I'm a former Christian, and though it had many benefits, the useful part of what I learned in 9 years could be compressed into a part-time course of a few months, without the superstitious stuff.
  • I struggle with planning and focus - I often have no sense of time.
  • I could probably be described with terms like akrasia, ADD and executive dysfunction, and maybe even Aspergers, aka high-functioning autistic. I'm not throwing the terms around lightly - a counselor suggested I had ADD (and it makes sense) and a number of people in my family (grandfather, brother, nephew) show many of the signs of high-functioning autism.
  • I work with a non-profit that I'm passionate about, but I want to be much more effective.
  • I have a discussion question I want to post about project management tools, but I don't have the points. I'd just passed the 20 points needed on my old account, but I'm back to zero as "CWG". Upvotes will make me smile :-).

Comment author: Jakinbandw 23 May 2012 09:55:22PM 2 points [-]

Hello. I come from HPMoR. I identify as Christian, though my belief and reasons for belief are a bit more complex than that. I'll probably do a post on that later in 'how to convince me 2+2=3'. I also get told that I over think things.

Anyway, that's not the reason I joined. I was reading an article by Eliezer Yudkowsky and he stated that whatever can be destroyed by truth should be. This got me wondering in what context that was meant. My first thought was that it meant that we should strive to destroy all false beliefs, which has the side effect of not lying, but then I began to wonder if it wasn't more personal. We should strive to let the truth that we observe destroy any beliefs that it is able to.

I realized that the difference between the two is that one is an end in and of itself (destroy all false belief), and one is a means to achieve a goal more effectively (don't hold on to a false belief when it has been proved false). I am really not sure how I feel about the first one; it seems very confrontational to no good purpose. There are a lot of false beliefs out there that people hold dear. However, the second one is strange as well.

One of people's goals is to be happy. Now there is an old saying that ignorance is bliss. While this is definitely not always a good policy, I can think of several cases off the top of my head where a person would be happier with a false belief than with reality. For example, what if everything that is happening to you right now is your mind constructing an elaborate fantasy to stop you from realizing that you are slowly being tortured to death? If you break free of said belief you are not happy, and you can do nothing to save yourself. The goal of being happy is actively opposed by the goal of learning the truth. [disclaimer: I've read about the mind constructing such fantasies in books and have experienced it only once in my life, to a limited degree, when I was being beaten up as a child. I don't know how scientifically accurate they are. This is just an example and if necessary I can come up with another one.]

So probably that wasn't what Mr. Yudkowsky meant when he said that what can be destroyed by truth should be (and if it is, can someone explain to me why?). So what does it mean? I've run out of theories here.

Comment author: Jakinbandw 25 May 2012 11:28:24PM 2 points [-]

Just a minor update. This thread has grown too big for me to follow easily. I am reading every post in it, but real life is taking up a lot of my time right now, so I will be very slow to reply. I found the limit on how many conversations I can hold at one time before I get a headache, and it appears to be lower than I suspected.

Once again, sorry, I didn't mean to drop out, but I stayed up way too late and even now I am recovering from sleep deprivation and still have an annoying headache. My body seems to want to wake up two hours before it should. I'll be back once I get my sleeping back to normal and get some more time. Even then, though, I am going to try to limit myself to only a couple of posts a day, because while I enjoy discussions, it's very easy for me to forget everything else when I get drawn into them.

I'll be back later. JAKInBAndW

Comment author: Bugmaster 26 May 2012 12:14:01AM 1 point [-]

Don't sweat it, I don't think anyone here expects you to answer all posts in an extremely rapid fashion. Ok, maybe some do, but you don't owe those people anything, anyway. This is a discussion site, not a job :-)

Comment author: CWG 25 May 2012 10:06:26AM 1 point [-]

Welcome.

Getting beaten up as a child sucks. Hope your life is a whole lot better now.

A somewhat related personal story: I was a Christian. I was plagued by doubts, and decided that I wanted to know what the truth was, even if it was something I didn't want to believe. I knew that I wanted Christianity to be true, but I didn't want to just believe for the sake of it.

So I started doing more serious reading. Not rationalist writings, but a thoughtful theologian and historian, NT Wright, who I've also seen appear on documentaries about New Testament history. I read the first two in what he was planning as an epic 5 part series: "The New Testament and the People of God" and "Jesus and the Victory of God".

I loved the way he explained history, and how to think about history (i.e. historiography). Also language, and ideas about the universe. He wrote very well, and warmly - you got the sense that this was a real human being, but he lacked the hubris that I'd often found in religious writers, and he seemed more interested in seeking truth than in claiming that he had it. He was the most rationalist of Christian writers that I came across.

In the end, the essence of his argument seemed to be that there is a way of understanding the Bible that could tell us something about God - if we believe in a personal god who is involved in the universe... and that if we believe in that kind of god, described in the Old Testament, then the idea of taking human form, and becoming the embodiment of everything that Israel was meant to be, does make sense. (He went into much, much more depth than this, and I can't do him justice at all, 15 years after I read it.) He didn't push the reader to believe - he just stated that it was something that made sense to him, and he did believe it.

He painted a picture and told a story which I found very appealing, to be honest. But in the end it didn't fit with how I understood the universe, based on the more solid ground of science.

I finally accepted that - my increasingly shaky belief was destroyed. It was hard, and I was upset - I'd been finding life hard, personally, and my beliefs were the framework that I'd used to attempt to make sense of things, such as an unhappy childhood and the death of both parents as a young adult. But I also felt freed, and after a couple of weeks, it didn't seem so bad. Years later, I'm much happier, and couldn't imagine myself as a Christian.

That's where I see the value personally in destroying false beliefs - I was freed to live without the restrictions imposed by a false belief system. The restrictions, in many cases, didn't have any sound basis outside the belief system, and I was better without them. There were positive aspects of Christianity, but I didn't need the beliefs to hold onto what I'd learnt about being compassionate and understanding, or about the value of community.

I felt that NT Wright told an honest, complex and interesting story, but in terms of the reality (or non-reality) of a god, he made an intuitive judgement which I don't see as sound (and which was different from my own intuition). But he helped me think things through at a time when I wasn't getting satisfactory answers from other Christians, and I really enjoyed his writing. I might even go back and read him some day.

That's wide of the topic, I know, but it's kind of relevant, and a welcome thread seems like a good place to go on tangents :-).

Comment author: TimS 24 May 2012 12:43:46AM 4 points [-]

Welcome to LessWrong. There's a sizable contingent of people in this community who don't think that uncomfortable truths need be confronted. But I think they are wrong.

As you say, one purpose of believing true things is to be better at achieving goals. To exaggerate slightly, if you believe "Things in motion tend to come to a stop," then you will never achieve the goal of building a rocket to visit other planets. You might respond that none of your actual goals are prevented by your false beliefs. But you can't know that in advance unless you know which of your beliefs are false. That's not belief, that's believing that you have a belief. And adjusting your goals so that they never are frustrated by false beliefs is just a long-winded way of saying Not Achieving Your Original Goals.

In theory, there might be a time when you wouldn't choose differently with a true belief than with a false belief. I certainly don't endorse telling an imminently dying man that his beloved wife cheated on him years ago. But circumstances must be quite strange for you to be confident that your choices won't change based on your beliefs. You, the person doing the believing, don't know when you are in situations like that because - by hypothesis - you have an unknown false belief that prevents you from understanding what is going on.

Comment author: electricfistula 23 May 2012 10:36:14PM *  3 points [-]

Hi, I joined just to reply to this comment. I don't think there is a lot of complexity hidden behind "whatever can be destroyed by truth should be". If there is a false belief, we should try to replace it with a true one, or at least a less wrong one.

Your argument that goes "But what if you were being tortured to death" doesn't really hold up, because that argument can be used to reach any conclusion. What if you were experiencing perfect bliss, but then your mind made up an elaborate fantasy which you believe to be your life... What if there were an evil and capricious deity who would torture you for eternity if you chose Frosted Flakes over Fruit Loops for breakfast? These kinds of "What if" statements followed by something of fundamentally unknowable probability are infinite in number and could be used to reach any conclusion you like; therefore, they don't recommend any conclusion over any other conclusion. I don't think it is more likely that I am being horribly tortured and fantasizing about writing this comment than that I am in perfect bliss and fantasizing about this, and so this argument does nothing to recommend ignorance over knowledge.

In retrospect (say it turns out I am being tortured) I may be happier in ignorance, but I would be an inferior rationalist.

I think this applies to Christianity too. At the risk of being polemical, say I believed that Christianity is a scam whereby a select group of people convince the children of the faithful that they are in peril of eternal punishment if they don't grow up to give 10% of their money to the church. Suppose I think that this is harmful to children and adults. Further, suppose I think the material claims of the religion are false. Now, you on the other hand suppose (I assume) that the material claims of the religion are true and that the children of the faithful are being improved by religious instruction.

Both of us can't be right here. If we apply the saying "whatever can be destroyed by truth should be" then we should each try to rigorously expose our ideas to the truth. If one of our positions can be destroyed by the truth, it should be. This works no matter who is right (or if neither of us are right). If I am correct, then I destroy your idea, you stop believing in something false, stop assisting in the spread of false beliefs, stop contributing money to a scam, etc. If you are right then my belief will be destroyed, I can gain eternal salvation, stop trying to mislead people from the true faith, begin tithing etc.

In conclusion, I think the saying means exactly what it sounds like.

Comment author: Bugmaster 23 May 2012 11:09:36PM 2 points [-]

These kinds of "What if" statements followed by something of fundamentally unknowable probability...

Minor nitpick: these statements have a very low probability of being true due to the lack of evidence for them, not an unknowable probability of being true as your sentence would imply.

This works no matter who is right (or if neither of us are right).

Ok, but what about unfalsifiable (or incredibly unlikely to be falsified) claims? Let's imagine that I am a religious person, who believes that a) the afterlife exists, and b) the gods will reward people in this afterlife in proportion to the number of good deeds each person accomplished in his Earthly life. The exact nature of the reward doesn't matter; whatever it is, I'd consider it awesome. Furthermore, let's imagine that I believe c) no objective empirical evidence of this afterlife and these gods' existence could ever be obtained; nonetheless, I believe in it wholeheartedly (perhaps the gods revealed the truth to me in an intensely subjective experience, or whatever). As a direct result of my beliefs, d) I am driven to become a better person and do more good things for more people, thus becoming generally nicer, etc.

In this scenario, should my belief be destroyed by the truth ?

Comment author: electricfistula 23 May 2012 11:45:23PM *  2 points [-]

Suppose we are neighbors. By some mixup, the power company is adding my electric bill to your own. You notice that your bill is unusually high, but you pay it anyway because you want electricity. In fact, you like electricity so much that you are happy to pay even the high bill to get continued power. Now, suppose that I know all the details of the situation. Should I tell you about the error?

I think this case is pretty similar to the one you've described about the religion that makes you do good things. You pay my bill because you want a good for yourself. I am letting you incur a cost, that you may not want to, because it will benefit me.

I think in the electricity example I have some moral obligation to tell you our bills have been combined. I think this carries over to the religious example. There is a real benefit to me (and to society) to let you continue to labor under your false assumption that doing good deeds would result in magic rewards, but I still think it would be immoral to let this go on. I think the right thing to do would be to try and destroy your false belief with the truth and then try to convince you that altruism can be rewarding in and of itself. That way, you may still be an altruist, but you won't be fooled into being one.

Comment author: beberly37 23 May 2012 04:29:13PM 3 points [-]

Hello all. It seems like it is a common enough occurrence that it no longer seems embarrassing, but I too found LW via HPMOR, which was referred to me by a friend; my eyes and neck hurt for at least a week after spending far too much time reading from a laptop. I have a BS and an MS in mechanical engineering, I have spent some time as a researcher and a high school teacher, and I am currently being an actual engineer at a biodiesel plant.

Growing up, everyone told me I was going to become an engineer (I was one of those kids that took apart my toys to see how they worked or try to make them better). I have been cursed, as I am sure is common at LW, in that most things (at least mentally taxing things) I try are pretty easy, so I have learned not to work all that hard at anything: high school, undergrad, grad school, work. One of the best parts about LW is that this is really hard stuff, especially for one who is accustomed to not having to put forth much mental effort. Yesterday I failed Wason's selection task miserably (thank you, LW, for striking me!) and it took me nearly a year of half-hearted, sporadic readings on Bayes's Theorem to finally be able to say I have moved up on Bloom's Taxonomy to at least understanding (there was a huge lack of statistics in my curricula).

After a year of lurking, I decided to start posting because there are so many questions I have that I think should be asked, or ideas on which I would love to hear input from higher-level rationalists, and this is the obvious starting place.

Comment author: geneticsresearcher 22 May 2012 01:31:07AM 3 points [-]

Hello, everyone. It's a pleasure to be here. I look forward to participating in discussions.

Comment author: genisage 16 May 2012 08:25:56AM 2 points [-]

Hello all! I'm a student of Mathematics and Computer Science and a fan of physics, linguistics, psychology, and biology. I found lesswrong through HPMOR. I would say that I've been a rationalist for most of my life. Cognitive biases and logical fallacies, as well as methods for recognizing them, were explained to me at a young age. Unfortunately, lately I've noticed that I'm not holding myself to the same standards of rationality that I used to, and even worse, I've noticed myself using the fact that I'm being rational as an excuse to be unpleasant. So, partially in an effort to begin reforming myself and partly in search of something to help alleviate my boredom this summer, I made an account here.

Comment author: e_c 14 May 2012 03:44:23PM 5 points [-]

Hello folks! I'm a student of computer science, found Less Wrong a few years ago, read some articles, found myself nodding along, but didn't really change my mind about anything significant. That is, until recently I came across something that completely shattered my worldview and, having trouble coping with that, I found myself coming back here, seeking either something that would invalidate this new insight or help me accept it if it is indeed true. Over the past few days, I have probably been thinking harder than ever before in my life, and I hope to contribute to discussions here in the future.

Comment author: athingtoconsider 13 June 2012 01:13:43PM 1 point [-]

What's the insight?

Comment author: GoldenWolf 13 May 2012 04:31:08PM 4 points [-]

Found HPMOR, changed my life, etc. Been reading for a couple years, and I figure it's finally time to start actually doing something. Not an academic at all. I'm in the Army and spend my free time with creative writing, but I understand most of the material, and I am capable of applying it.

I have a question that's not in the FAQ. I recently read The Social Coprocessor Model. I want to reread it again in the future without keeping a tab permanently open. There is a save button near the bottom, and I clicked it. How exactly does this work? I can't figure out how to access the post from the main page. I suppose I could always keep a document with my favorite links or clutter up my browser's favorites, but it seems stupid if there's already a system in place here.

Comment author: Randaly 13 May 2012 04:44:04PM *  2 points [-]

Welcome to LessWrong!

If you get to either the main or the discussion page by clicking on either button, you should see a smaller row of buttons immediately beneath the two big buttons ("Main" and "Discussion"). One of them should read "Saved"; if you click on that, you'll see all of the posts you've saved.

Comment author: Kindly 12 May 2012 10:30:24PM 5 points [-]

Hello!

I'm a graduate student in mathematics and came across Less Wrong by, uh, Googling "Bayes' Theorem". I've been putting off creating an account for the past month or so, because I've had absolutely no free time on my hands. Now that the semester's winding down, I've decided to try it out, although I may end up disappearing once things get going again in the fall.

Out of the posts I've read on LW so far, I'm the most impressed by the happiness and self-awareness material -- but also intrigued by the posts on math, especially probability, and will hopefully have something to contribute to those (because, well, probability is what I do). And then there's HPMOR.

We'll see what I end up doing now that I have the power to insert permanent impressions of my thoughts into the content of this website.

Comment author: Monkeymind 11 May 2012 08:56:13PM *  2 points [-]

Came here doing research on QM and decided to try out some ideas. I learn to swim best by jumping right in over my head. My style usually doesn't win me many friends, but I recognize who they are pretty fast, and I learn what works and what doesn't.

Someone once called me jello with a temper... but I'm more like a toothless old dog, more bark than bite. The tough exterior has helped me in many circumstances.

On the first day as a new kid in high school, I walked up to the biggest, baddest senior there, with all his sheep gathered around him in the parking lot, and slapped him upside his head as hard as I could. It barely had an effect! He could have crushed my little body with one hand, but instead he laughed so hard he nearly broke a rib. No one ever messed with me because he put the word out: hands off his little buddy. And of course I also gained the reputation of one crazy SOB!

Being retired, I have a lot of time on my hands, and I am interested in learning as much as I can before I become worm food. Right now my interests are GR, QM and AI, but I don't understand what I know about them!

I have a request, I just returned from the V.A. Hospital. My doctor says I need cataract surgery.

I am having a hard time making a decision on what to do. How would Bayes' theorem or decision theory help me make a decision based upon the following information? If you would use this in your decision-making process, I am willing to use it in mine. I'm stumped, and the doctors have given bad advice many times over the years anyway.

There are inherent risks of infection, failure and loss of eyesight. I could have my right eye done right away (it's ripe) but it could possibly wait a year. However, at that time I will need to have cataract surgery in my left eye as well (couple of weeks apart). I prefer not to have both eyes done at the same time.

An injury in '06 caused a retinal detachment in my right eye. I may be having a retinal detachment in my left eye (I am having flashing lights similar to those before my right eye detached). It took a couple of months before the occlusion started last time (after the flashing lights began). An occlusion is like an eclipse of grey. If it makes it all the way across, you are blind. The doctor couldn't see signs of detachment, but cautions me to get there right away if the occlusion begins. Once occlusion starts, surgery needs to happen within 24-72 hours. Success diminishes rapidly after 24 hours.

I am at high risk for retinal detachment because of severe myopia (near-sightedness). The right-eye surgery was a pneumatic retinopexy, and so I have an increased risk of detachment or other problems with cataract surgery.

I am writing a novel and want to finish it before the surgeries because of potentially months of downtime, and in case of problems or permanent loss of eyesight in one or both of my eyes.

The doctor says that it is my choice to wait up to a year, but that I need to be watchful for signs of my left eye detaching, and I don't want my right cataract to get too hard, which increases the risk of detachment and lowers the success rate of cataract surgery.
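
Here is the sort of comparison I imagine decision theory pointing at, as a rough sketch with placeholder numbers (the probabilities, utilities, and the little helper function below are pure guesses of mine for illustration; the real figures would have to come from the doctors). I mainly want to know whether this is the right shape of calculation:

```python
# Rough expected-value comparison of "operate on the right eye now" vs. "wait up to a
# year to finish the novel". All probabilities and utilities are PLACEHOLDERS.

def expected_value(outcomes):
    """outcomes: list of (probability, value) pairs; probabilities should sum to 1."""
    return sum(p * v for p, v in outcomes)

# Arbitrary utility scale: 100 = surgery goes well, 0 = losing useful sight in that eye.
FINISH_NOVEL_BONUS = 20  # how much I value getting the book done before any downtime

operate_now = expected_value([
    (0.95, 100),  # placeholder: surgery succeeds
    (0.05, 0),    # placeholder: infection, detachment, or other serious complication
])

wait_a_year = expected_value([
    (0.70, 100 + FINISH_NOVEL_BONUS),  # placeholder: no detachment, harder cataract still OK
    (0.20, 60 + FINISH_NOVEL_BONUS),   # placeholder: detachment caught within 24-72 hours
    (0.10, 0 + FINISH_NOVEL_BONUS),    # placeholder: detachment caught too late
])

print(f"Expected value of operating now:  {operate_now:.1f}")
print(f"Expected value of waiting a year: {wait_a_year:.1f}")
```

If that is roughly the right framing, then the hard part is clearly getting honest numbers for the probabilities, not the arithmetic.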

Thanx!

Comment author: Salemicus 10 May 2012 10:10:28PM 3 points [-]

Hi everyone. I've been lurking here for a couple of years, but decided to register so I could contribute. I work in software and am in my early 30s.

I found this site through overcomingbias, which in turn I came across through the GMU-linked economics blogs. However, I wouldn't describe myself as a rationalist - I find the discussions here interesting, but I think that, by and large, folk wisdom is pretty accurate.

I love the sequences and Eliezer's writings generally - they are what first got me reading the site, and I have been greatly enjoying following the reposts. The ones on zombies in particular have really caused me to re-evaluate my thinking.

Thanks, and look forward to meeting you all!

Comment author: Ghatanathoah 10 May 2012 07:39:21AM 6 points [-]

Hi everyone. I have been lurking since the site started, but did not have the courage to start posting until recently. I am a male college graduate in his mid-twenties, happily engaged and currently job-hunting, and have been fascinated by science and reason since I was a child. I was one of those people who actually identified with the "Hollywood Rational" robots and aliens in science fiction and wanted to be more like them. Science and science fiction socialized me and made me curious about the inner working of the universe.

I love the sequences and consider them a major influence on the way I think. The insights into reasoning, psychology, and metaethics the sequences gave me helped make me who I am today. Less Wrong made me a consequentialist and an altruist. It helped me realize that ethical naturalism might be true after all. I learned about akrasia from LW, which caused me to reject the poisonous cynicism that Revealed Preference Theory had infected me with. It's helped me put my life in order a little better, although I'm still fighting akrasia.

My only regret is that I recently started suffering bouts of severe depression because something snapped and made me start thinking about existential risks in Near Mode instead of Far Mode. I suspect it was Robin Hanson's "em" posts, which made me realize that AI could still threaten the future of the human race even if the FOOM theory turned out to be incorrect. I sometimes wish with all my heart that I could bleach the em posts out of my brain and return to a higher level of happiness, start believing in Julian Simon and the promise of the future again. But on the other hand those posts have caused me to think about certain topics much harder and more clearly than I would have otherwise.

I'm not a very prolific poster so far, but I think it's high time I started being part of the community that's been part of my life for so long.

Comment author: LordSnow 09 May 2012 05:24:03PM 6 points [-]

Hi everyone! I am still a high school student but very interested in what I read here on LessWrong! I decided to register to contribute to discussions. Until now, I have been lurking but hopefully I will be able to join the conversation in a useful way.

Comment author: Randaly 10 May 2012 04:00:40AM 1 point [-]

Hiya LordSnow! If you want to get to know some of the other LW highschoolers, we have an (inactive) Google Group, and a Facebook Group.

Comment author: John_Maxwell_IV 09 May 2012 06:05:00PM *  2 points [-]

I am still a high school student

Sorry to hear about that!

Comment author: LordSnow 09 May 2012 10:56:24PM 3 points [-]

I find your jumping to conclusions somewhat offensive. In fact, I don't feel socially disadvantaged for my interests.

Comment author: John_Maxwell_IV 10 May 2012 02:02:28AM *  1 point [-]

No! I refuse to believe that high school could be anything but a terrible prison!

runs away screaming

Comment author: CuSithBell 09 May 2012 11:00:47PM 1 point [-]

Excellent! I also find this picture of high school sorta baffling.

Comment author: Schwarmerei 08 May 2012 06:12:39PM 4 points [-]

Hi,

I am the first in a family of budding rationalists to jump in to the LessWrong waters. I got my start as a Rationalist when I was born and was influenced very heavily through my childhood by my parents' endless boxes of hard sci-fi and old school fantasy. Special mention goes to The World of Null-A (and its sequel) in introducing the notions of a worldview being 'false to facts', and a technique the main character uses (the "cortical-thalamic pause") which is very similar to "I notice that I am confused." I read everything avidly and have a mountain of books on my shelves dealing with neuroscience and cognitive biases.

The fam: I'm first in a family of six kids who have always been confused by the illogic and muddled thinking of our peers. We've all grown up strongly under the sway of the aforementioned sci-fi/fantasy collection and like nothing better than to debate topics and point out each other's fallacies or gaps in logic. We are all slightly obsessed with HPMoR (I being the only one to have read Overcoming Bias / LW before the story's inception) and I personally find that Harry's thinking often mirrors my own to an eerie degree of similarity.

Several of us are also very interested in reforming education and are forming a tech company to that end (I'm a programmer / comp jack of all trades, and my almost-twin bro is a graphic designer*). I plan on diving into the sequences more rigorously in the upcoming months, as I'd like to integrate rationalist principles into the basic fiber of the products we produce (self-guided, community-assisted learning software).

(* While not all actively involved in the company, all six of us – including the girls and the 13 year old – can program.)

Comment author: chloejune123 07 May 2012 08:28:24PM 7 points [-]

Hi! I found LW by HPMoR like so many other people, and I have found a lot of interesting articles on here. I'm only 12, so there are tons of articles that I don't understand, but I am determined to figure them out. My name is Chloe and I hope that we can be friends!

Comment author: Hang 07 May 2012 06:58:49PM 3 points [-]

I'm a master's candidate in Logic at UvA. Rationality is one of my interests, although I seem to have come to it from the opposite end of the spectrum from most people at LessWrong (from metaphysics and philosophy to rationality).

I am very interested in observing the reductionist approach, even more so after learning that Eliezer values GEB so highly.

Comment author: Alerus 07 May 2012 03:48:56PM 5 points [-]

Hi! So I've actually already made a few comments on this site, but had neglected to introduce myself so I thought I'd do so now. I'm a PhD candidate in computer science at the University of Maryland, Baltimore County. My research interests are in AI and Machine Learning. Specifically, my dissertation topic is on generalization in reinforcement learning (policy transfer and function approximation).

Given this, AI is obviously my biggest interest, but my study of AI has also led me to apply the same concepts to human life and reasoning. Lately, I've been thinking more about systems of morality and how an agent should reach rational moral conclusions. My knowledge of existing work in ethics is not profound, but my impression is that most systems are at too high a level to make concrete (my metric is whether we could implement it in an AI; if we cannot, then it's probably too high-level for us to reason rigorously with ourselves). Even desirism, which I've examined at least somewhat, seems to be a bit too high-level, but is perhaps closer to the mark than others (to be fair, I may just not know enough about it). In response to these observations, I've been developing my own system of morality that I'd like to share here in the near future to receive input.

Comment author: wirov 07 May 2012 06:39:57PM 2 points [-]

Hi, I'm a 20-year-old (white, male) physics student from Germany. My main reason for not believing in any religion is Occam's razor. (I'm not sure whether this makes me an atheist or an agnostic. Any thoughts on that would be appreciated.)

I stumbled across HPMoR by accident in 2010 and read "Three Worlds Collide" and some other texts on Eliezer's personal website. During 2011 I did some Sequences-hopping (i.e. I started at one article and just followed inline links that sounded interesting, thus causing a tab explosion). I finally registered a few weeks ago to join the recent MoR discussion threads. For the future, I plan to read the Sequences in the intended order (which will probably take me until at least 2013) and to join some other discussions from time to time.

Comment author: Drewsmithyman 03 May 2012 06:34:31PM *  4 points [-]

Hello community. My name is Drew Smithyman and I am an executive assistant at CFAR. I have not been with them long, nor have I been reading the sequences very long, but I intend to continue doing both.

I need to post a discussion thread about some interviews we need to do - could people please do me the favor of upvoting this comment twice so that I may start one as soon as possible?

Thank you.

Comment author: Brigid 01 May 2012 11:01:39PM 14 points [-]

Hi, I’m Brigid. I’ve been reading through the Sequences for a few weeks now, and am just about to start the Quantum Section (about which I am very excited). I found out about this site from an email the SIAI sent out. I’m a Signals Intelligence officer in the Marine Corps and am slated to get out of the military in a few months. I’m not too sure what I am going to do yet though; as gung-ho as I originally was about intel, I’m not sure I want to stay in that specific field. I was a physics and political science major in college, with a minor in women’s studies. I’ve been interested in rationality for a few years now and have thoroughly enjoyed everything I’ve read so far here (including HPMOR). Also, if there is anyone who is interested in starting a Meetup group in Hawaii (Oahu), let me know!

Comment author: Eliezer_Yudkowsky 02 May 2012 06:07:45AM 3 points [-]

Hi, Brigid! Pleased to have you here! Experience has shown that by far the best way to find out if anyone's interested in starting an LW group is to pick a meeting place, announce a meetup time, and see if anyone shows up - worst-case scenario, you're reading by yourself in a coffeeshop for an hour, and this is not actually all that bad.

Comment author: shminux 01 May 2012 11:14:57PM *  3 points [-]

Welcome!

am just about to start the Quantum Section (about which I am very excited).

A warning: while the QM sequence in general is very readable and quite useful for the uninitiated, the many-worlds advocacy is best taken with a mountain of salt. Consider skipping the sequence on the first pass, and returning to it later, after you've covered everything else. It is fairly stand-alone and is not relevant to rationality in general.

Comment author: Eliezer_Yudkowsky 02 May 2012 06:11:27AM 3 points [-]

A meta-warning: Take shminux's "mountain of salt" advice with an equally large mountain of salt plus one more grain - as will become starkly apparent, there's a reason why the current QM section is written the way it is, it's not meant to be skipped, and it's highly relevant to rationality in general.

Comment author: thomblake 02 May 2012 03:41:47PM 6 points [-]

it's not meant to be skipped, and it's highly relevant to rationality in general.

A few people have asserted this, but how is it actually relevant? Is it just a case study, or is there something else there? As RichardKennaway asks, how does QM make a difference to rationality itself?

Comment author: shminux 03 May 2012 04:27:52PM *  1 point [-]

I have dutifully gone through the entire sequence again, enjoying some cute stories along the way, and my best guess at what EY means is that it is relevant not in any direct sense ("QM is what rationality is built on"), but more as a teaching tool: it brings "traditional Science" into conflict with "Bayesian rationality". (Bayesianism wins, of course!) MWI also lends some support to EY's preferred model, Barbour's timeless physics, and thus inspires TDT.

Comment author: thomblake 03 May 2012 04:41:33PM 2 points [-]

That still doesn't seem like enough to justify the reversal from "not relevant" to "highly relevant".

Comment author: shminux 03 May 2012 06:23:49PM 1 point [-]

What reversal? I still think that it detracts from the overall presentation of "modern rationality" by getting people sidetracked into learning open problems in physics at a pop-sci level. Whatever points EY was trying to make there can surely be made better without it.

Comment author: thomblake 03 May 2012 06:30:18PM 1 point [-]

What reversal?

I meant where you said "not relevant" and Eliezer responded with "highly relevant". It sounds to me as though he thinks it's fundamental to rationality or something. Very confusing.

Comment author: ArisKatsaris 02 May 2012 04:18:14PM *  2 points [-]

Speaking from a non-physicist perspective, much of what the QM sequence taught me is to see the world from the bottom up: QM is regular, but it adds up to normality, and it's normality that's weird. Delving down into QM is going up the rabbit hole, away from weirdness and normality and into mathematical regularity.

By analogy, normal people are similarly weird because they're the normality that was produced as the sum of a million years of evolution. Which in turn helps you realize that a random mind plucked out of mindspace is unlikely to have the characteristics we attribute to humanlike normality. Because normality is weird.

Once you go from bottom to top, you also help dissolve some questions, like problems of identity and free will (though I had personally dissolved the supposed contradiction between free will and determinism many years before I encountered LessWrong). I still think that many of the knots people tie themselves into over issues like Quantum Suicide or Doomsday Dilemmas are caused by insufficient application of the bottom-up principle, or worse yet a half-hearted application thereof.

Comment author: DanArmak 02 May 2012 04:38:44PM *  2 points [-]

Because normality is weird.

It's bad enough that we've got people talking about things not being weird, as if weirdness were an objective property rather than something in the mind of the observer. The words of yours that I quoted are even worse; they're a self-contradiction.

If you're not willing to let the word "weird" have its dictionary definition, please, please just taboo it and let the subject die, rather than trying to redefine it as the opposite of the original meaning.

Comment author: chaosmosis 02 May 2012 04:42:37PM 2 points [-]

The commenter was saying that "our intuitive understanding of reality" is weird, I think. That's why the commenter was able to say, without contradicting themselves, that Quantum Mechanics fixed some problems and made things less weird.

Comment author: thomblake 02 May 2012 04:22:58PM 1 point [-]

Yeah, that's roughly the best I could come up with, but it doesn't seem sufficient. Noticing the extent of cognitive bias is enough to figure out that humans are weird.

Comment author: RichardKennaway 02 May 2012 06:38:24AM 12 points [-]

How would the Sequences be different, other than in the QM parts, if we lived in a classical universe, or if we had not yet discovered QM?

Comment author: [deleted] 02 May 2012 07:22:38AM 2 points [-]

Wild Mass Guessing: in a classical universe, particles are definable individuals. This breaks a whole mess of things; a perfect clone of you is no longer you, etc.

Comment author: JGWeissman 02 May 2012 05:21:26PM 9 points [-]

a perfect clone of you is no longer you

The lack of identity of individual particles is a knock-down argument against our identities being based on the identities of individual particles. However, if there were identity of individual particles, that would not require that the identity of individual particles contribute to our identities; it would just remove a knock-down argument against that idea.

Comment author: DanArmak 02 May 2012 05:45:30PM *  1 point [-]

(Almost) all the particles in our bodies are replaced anyway, on the scale of a few years. Replacement here means a period of time when you're without the molecule, and then another comes in to take its place; so it's real whether or not particles have identities. This applies to quite large things like molecules. Once we know that, personal identity rooted in specific particles is shaky anyway.

Comment author: thomblake 02 May 2012 05:29:19PM *  1 point [-]

An important point.

Heraclitus probably didn't believe in lack of identity of individual particles, but he did believe we are patterns of information, not particular stuff.

EDIT: On second thought, he'd probably have worked out the lack of identity of individual particles if pressed, since it follows from that.

Comment author: DanArmak 02 May 2012 05:50:04PM 4 points [-]

a perfect clone of you is no longer you

Not necessarily. "What/who is you" is a matter of definition to a large extent. If particles have identities (but are still identical to all possible measurements), that doesn't stop me from defining my personhood as rooted in the pattern, and identifying with other sufficiently similar instances of the pattern.

Comment author: RichardKennaway 02 May 2012 07:54:31AM 3 points [-]

That minds are physical processes seems discoverable without knowing why matter is made of atoms and what atoms are made of. That elimination of mentalism seems sufficient to justify the ideas of uploading, destructive cryonics, artificial people, and so on.

But I'm actually more interested in what implications there are, if any, for practical rationality here and now. (I will be unmoved by the answer "But FAI is the most practical thing to work on, we'll all die if it's done wrong!!!")

Comment author: fubarobfusco 02 May 2012 01:15:49AM *  3 points [-]

Well, there are a couple of things going on in the QM sequence. One of them is MWI. The other is the general debunking of the commonly-held idea that QM is soooooooo weeeeeeeeird.

Comment author: shminux 02 May 2012 02:00:39AM 1 point [-]

Yes, that's the good part.

Comment author: Bart119 01 May 2012 07:06:22PM 2 points [-]

I stumbled here while searching for some topic, and now I've forgotten which one. I've been posting for a few weeks, and just now managed to find the "About" link that explains how to get started, including writing an intro here. Despite being a software engineer by trade these past 27-odd years, I manage to get lost navigating websites a lot, and I still forget to use Google and Wikipedia on topics. Sigh. I'm 57, and was introduced to cognitive fallacies as long ago as 1972. I've tried to avoid some of the worst ones, but I also fail a lot. I kept a blog with issue-related essays for a while, and whatever its shortcomings, I was proud of the fact that when I ran out of things to say, I stopped posting. With the prospect of a community like this one that might respond substantively, maybe I'll be inspired to write more here.

This description of a guy who believed in objective morality but lost his faith impressed me a lot. That's me. I don't think there's any very compelling reason to live one's life in a particular way, or any real reason that some actions are preferable to others. That might be called nihilism. I live a decent life, though, because I'm happier pretending not to be a nihilist and making moral arguments and living honorably and all. But when the going gets tough (as in unpleasant consequences to some line of thought that doesn't make me happy), I always have the option of shrugging my shoulders, yawning, and going on to the next topic. Rationality too is a fun tool. I find it most helpful within the relatively small questions of life.

Comment author: avichapman 30 April 2012 10:09:37PM 2 points [-]

Hi,

I'm a software engineer in Adelaide, Australia. I've tried to be a rationalist all of my life, but had no idea that there were actual techniques you could learn from others. I'd simply tried to confront myself about the biases that books told me I had, with varying degrees of success. I'm very excited to be here.

One thing that bothers me, though, is that I am feeling increasingly isolated from others. It used to be that I had thought just enough to be one inferential step ahead of others, which made me seem smart when I talked. Now I'm more than one inferential step ahead of many people in many areas, and this leads to confusion and a lack of communication. People think I'm crazy and ignore me. Well, except for those of my friends who are coming with me on this journey. I hope being part of this community will be a good social experience. And if anyone here is from Adelaide, I'd love to meet you in person!

Is there any way for a newbie to ask questions of an old hand? A few weeks ago I read about using Bayes' Theorem to evaluate evidence, and now I see its use everywhere. I just read a post on Pharyngula that took a very emotional stance on something that seemed like it could be modelled perfectly well with a Bayesian calculation. Without the actual percentages I had to make certain assumptions about relative values, but I came to a surprising conclusion. Now I need someone to check my work and tell me if I did it wrong.
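For concreteness, the generic shape of such an update is just Bayes' theorem. With purely illustrative numbers (a prior of 0.1 for the hypothesis, and likelihoods of 0.8 and 0.3 for the evidence under the hypothesis and its negation; these are not the values from the Pharyngula post):

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)} = \frac{0.8 \times 0.1}{0.8 \times 0.1 + 0.3 \times 0.9} \approx 0.23$$

Even fairly strong-sounding evidence only lifts a low prior to about 23%, so the conclusion one reaches depends heavily on the assumed relative values.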

Anyway, I'm glad to meet you all! Avi

Comment author: blob 25 April 2012 02:57:10PM 9 points [-]

Hello!

I'm a mathematician and working as a programmer in Berlin, Germany. I read HPMOR after following a recommendation in a talk on Cognitive Psychology For Hackers and proceeded to read most of the sequences.

Reading LW has had several practical consequences for me: Spaced repetition is awesome for memorizing things I value. Efficient charity has led to me giving more and being more confident about it actually having positive effects. I read a book on small talk and try to practice. I stopped using cheap multivitamin supplements and currently only take vitamin D. My spare time is mostly used to care for my daughter these days - I got some modafinil and am experimenting with getting extra time this way.

I'm also part of a small Berlin LW meetup: spuckblase and I have met twice, and now we have been contacted by two other Berlin-based lurkers, which prompted the creation of a wiki entry and a mailing list. We're now planning the first meetup that will actually get a meetup post and be announced in advance.

Comment author: olalonde 24 April 2012 10:54:20PM *  5 points [-]

Hi all! I have been lurking on LW for a few months (years?). I believe I was first introduced to LW through some posts on Hacker News (http://news.ycombinator.com/user?id=olalonde). I've always considered myself pretty good at rationality (is there a difference between that and being a rationalist?) and I've always been an atheist/reductionist. I recently (4 years ago?) converted to libertarianism (blame Milton Friedman). I was raised by 2 atheist doctors (as in PhD). I'm a software engineer and I'm mostly interested in the technical aspects of achieving AGI. Since I was a kid, I've always dreamed of seeing an AGI within my lifetime. I'd be curious to know if there are some people here working on actually building an AGI. I was born in Canada, have lived in Switzerland, and am now living in China. I'm 23 years old IIRC. I believe I'm quite far from the stereotypical LWer on the personality side, but I guess diversity doesn't hurt.

Nice to meet you all!

Comment author: olalonde 24 April 2012 11:14:44PM 1 point [-]

Before I get more involved here, could someone explain to me what the following are:

1) x-rationality (extreme rationality)
2) a rationalist
3) a Bayesian rationalist

(I know what rationalism and Bayes' theorem are, but I'm not sure what the terms above refer to in the context of LW.)

Comment author: Nornagest 24 April 2012 11:37:38PM *  4 points [-]

In the context of LW, all those terms are pretty closely related unless some more specific context makes it clear that they're not. X-rationality is a term coined to distinguish the LW methodology (which is too complicated to describe in a paragraph, but the tagline on the front page does a decent job) from rationality in the colloquial sense, which is a much fuzzier set of concepts; when someone talks about "rationality" here, though, they usually mean the former and not the latter. This is the post where the term originates, I believe.

A "rationalist" as commonly used in LW is one who pursues (and ideally attempts to improve on) some approximation of LW methodology. "Aspiring rationalist" seems to be the preferred term among some segments of the userbase, but it hasn't achieved fixation yet. Personally, I try to avoid both.

A "Bayesian rationalist" is simply a LW-style rationalist as defined above, but the qualification usually indicates that some contrast is intended. A contrast with rationalism in the philosophical sense is probably the most likely; that's quite different and in some ways mutually exclusive with LW epistemology, which is generally closer to philosophical empiricism.

Comment author: Bugmaster 24 April 2012 11:41:31PM 2 points [-]

As far as I understand, a "Bayesian Rationalist" is someone who bases their beliefs (and thus decisions) on Bayesian probability, as opposed to ye olde frequentist probability. An X-rationalist is someone who embraces both epistemic and instrumental rationality (the Bayesian kind) in order to optimize every aspect of his life.

Comment author: curiosity 19 April 2012 04:58:19AM 3 points [-]

Hello! I was introduced to LessWrong through HPMOR. I find rationality interesting as someone who was brought up in an extremely religious household and is now trying to wade through what I actually believe rather than what I was taught.

I'm seventeen and am interested in the rationality summer camp, but the "gifted in math" part is stopping me short. I'm in honors and AP classes, but I'm not especially amazing at math, nor am I especially bad at it. Is genuine interest in the subject matter enough?

Comment author: Nisan 19 April 2012 06:22:06AM *  1 point [-]

The announcements for the May, June, and July minicamps don't mention a "gifted in math" requirement. You should definitely apply!

EDIT: The May, June, and July minicamps differ from the August minicamp in having less advanced math. This doesn't mean they're less useful! They do cost money but there may be scholarships available. And there's no reason you can't apply to multiple camps.

Comment author: Bugmaster 19 April 2012 05:10:35AM *  1 point [-]

Is genuine interest in the subject matter enough?

I may be jaded, but IMO having a "genuine interest" in math would already put you in the 99th percentile of the population. This might not be as good as being "gifted" (whatever that means), but it should at least be close enough for a rationality camp.

Edited to add:

DISCLAIMER: I myself have never been through the rationality camp, so I'm just guessing here.

Comment author: troll 17 April 2012 08:34:44PM 13 points [-]

minimalist, 17, white, male, autodidact, atheist, libertarian, california, hacker, studying computer science, reading sequences, intellectual upbringing, 1 year bayesian rationalist, motivation deficient, focusing on skills, was creating something similar to bayesian rationality before conversion, have read hpmor (not intro to lw), interested in contributing to ai research in the future

Comment author: jimrandomh 24 April 2013 05:09:42PM 6 points [-]

Consider restarting with a different account name. Trolling (that is, trying to provoke people) is not welcome here, and when your username is "troll", people will not (and should not) give you the benefit of doubt.

Comment author: Bugmaster 24 April 2013 11:19:35PM 1 point [-]

You weren't kidding when you said "minimalist". Nicely done.

Comment author: troll 24 April 2013 11:30:22PM 1 point [-]

I guess a lot of people are interested enough in an account with the handle "troll" to check my first post, but not enough to not consider the name when reviewing posts.

Comment author: Bugmaster 24 April 2013 11:37:49PM 2 points [-]

Realistically, when someone replies to one of my posts on some long thread, I don't take the time to click through their handle and find their own intro post. I don't think that doing so is a good use of my time, and I believe that I am typical in this regard. However, I do take the time to read their handle, and if it seems to say "I am not arguing in good faith", I take notice.

This gives me an idea for a new Less Wrong feature, though: allow users to enter a short description of themselves, and display it when the mouse hovers over their handle for a certain amount of time. I know how I'd implement it with jQuery, but I'm not sure how easy it would be to plug into the LW general architecture.
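A minimal sketch of the client side (written here in TypeScript, assuming jQuery is already loaded on the page; the a.author selector, the /user/<handle>/bio endpoint, and the delay value are all hypothetical placeholders, not the actual LW markup or API):

```typescript
declare const $: any; // assume jQuery is available globally on the page

const HOVER_DELAY_MS = 400; // "a certain amount of time" (placeholder value)
let hoverTimer: number | undefined;

// After hovering over a user handle for a while, fetch the user's short
// self-description and show it in a small absolutely-positioned tooltip.
$(document).on("mouseenter", "a.author", function (this: HTMLElement, e: any) {
  const handle = $(this).text();
  hoverTimer = window.setTimeout(() => {
    // Hypothetical endpoint returning the user's self-description as plain text.
    $.get("/user/" + encodeURIComponent(handle) + "/bio", (bio: string) => {
      $('<div class="bio-tooltip">')
        .text(bio)
        .css({ position: "absolute", left: e.pageX, top: e.pageY + 12 })
        .appendTo("body");
    });
  }, HOVER_DELAY_MS);
});

// Cancel the pending timer and remove any tooltip when the mouse leaves the handle.
$(document).on("mouseleave", "a.author", () => {
  window.clearTimeout(hoverTimer);
  $(".bio-tooltip").remove();
});
```

Storing the description would presumably also need a field on the user preferences page and a server-side handler, which is the part that depends on the LW general architecture.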

Comment author: Sniffnoy 24 April 2013 11:51:47PM 1 point [-]

I think it would be simpler to just allow people to add a short description of themselves to the user page. (And then maybe later the hovering thing can be added if people want that.)

Comment author: thomblake 18 April 2012 11:56:39PM 8 points [-]

I'm sure you're aware at this point, but with that description you blend into the wallpaper.

Thank you for creating a comment to link "stereotypical Less Wrong reader". If only you were a couple of years older.

Since you're 17, have you looked into the week-long summer camp?

Comment author: troll 24 April 2012 10:42:23PM 1 point [-]

I have and I have submitted an application.

Comment author: RichardKennaway 18 April 2012 11:20:35AM 9 points [-]

The Identikit LessWrongian!

Comment author: Oscar_Cunningham 17 April 2012 10:31:50PM *  9 points [-]

"Minimalist" is implied by the sparsity of the rest of the comment, and so is ironically redundant.

Comment author: troll 17 April 2012 10:39:47PM 9 points [-]

There are a few other reasons I could be formatting my introduction that way, such as being bad at English or writing in general. I used "minimalist" both as a heads-up about the format and to steer attention away from the other possible explanations.

Comment author: MarkusRamikin 17 April 2012 09:08:52PM 4 points [-]

That handle bodes well.

Comment author: Multiheaded 17 April 2012 10:00:36PM 7 points [-]

On an elitist gaming forum I used to frequent (RPG Codex), we called such things "post-ironic" (meaning "post-modern as fuck online performance art").

Basically the joke is that everyone gets the joke, and that allows its author to act as if it was no joke, and self-consciously reference that fact - which is the joke.

Comment author: Emile 17 April 2012 09:01:42PM 4 points [-]

Welcome to LessWrong!

(For a cheap way to give a better impression, you may want to switch to another user name)

Comment author: shokwave 17 April 2012 09:00:17PM 3 points [-]

Contrarian?

Comment author: troll 17 April 2012 09:31:29PM 10 points [-]

No.

Comment author: DSimon 19 April 2012 12:13:42AM 1 point [-]

Anti-contrarian?

Comment author: troll 24 April 2012 10:25:06PM 2 points [-]

If you mean 'against people who are contrarian', no. If you mean 'for popular opinions', no.

Comment author: rejuvyesh 16 April 2012 10:43:43AM 6 points [-]

Hello everyone!

I am Jayesh Kumar Gupta. I am from Jodhpur, India. I have been interested in rationality for some years now. I came across this site via HPMOR. I have been reading posts here while trying to wade my way through the gigantic Sequences, but was not confident enough to join this group (people here seem to know so much). Right now I am an undergraduate student at IIT Kanpur. Hopefully I too will contribute something to the site in the future.

Thanks!