Posting for the first time because I feel I could use some help. [And yes, I know about the Welcome Thread, but I think the Open Thread gets more attention, so I'm posting here first. Maybe later I'll post in the Welcome Thread.]
I come from a very religious family and community, but I'm a closet atheist. (More accurately, I'd label myself agnostic leaning atheist with regard to the existence of one or more intelligent world-designer(s), but I give almost no credence to any religious claims beyond that. In any case, for simplicity I'm just going to refer to myself here as an atheist.)
I have only a single very close friend who knows of my atheism. Five or six other people know I disagree with all the standard religious arguments, but they think that I've opted for "blind faith" and am still religious. Most of my family and friends, however, although they know that I'm unusually open-minded and intellectual for my closed-minded religious community (and they look at me a bit strangely for that), still think that I'm fully religious.
A bit of background: I started doubting in high school, but it didn't turn into a full-fledged crisis of faith until I was about 18 or 19. Eventually...
Paul Graham wrote an article called What You Can't Say that seems somewhat relevant to your position, and in particular engages with the question of when epistemic rationality is instrumentally rational. I bring it up specifically because his conclusion is mostly "figure out what you can't say, and then don't say it." But he's also a startup guy, and is well aware of the exception: many good startup ideas seem unacceptable, because if they were acceptable ideas they'd already be mature industries. So many heresies are not worth endorsing publicly, even if you privately believe them, but some heresies are (mainly, when you expect significant instrumental gains from doing so).
I grew up in a Christian household and realized in my early teens that I was a gay atheist; I put off telling people for a long time and I'm not sure how much I got from doing so. (Both of my parents were understanding.) Most of my friends were from school anyway, and it was easy to just stop going to church when I left town for college, and then go when I'm visiting my parents out of family solidarity.
My suspicion is that your wife would prefer knowing sooner rather than later. I also predict that it is not going to get easier to tell her or your children as time goes on--if anything, as your children age and absorb more and more religious memes and norms, the more your public deconversion would affect them.
I think that your edit clarified things for me substantially. I read the entire article that you linked. I regret my earlier post for reasons that you will hopefully see.
I have a relevant anecdote about a simpler situation. I was with two friends. The One thought that it would be preferable for there to be less and/or simpler technology in the world, and the Other thought that the opposite was true. The One believed that technology causes people to live meaningless lives, and the Other conceded that he believed this to be true but also believed that technology has so many other benefits that this is acceptable. The One would always cite examples of how technology was used for entertainment, and the Other, examples of how technology was used for work. I stepped in and pointed out the patterns in their respective examples. I said that there were times when I had wasted time by using technology. I pointed out that if a person were like the One, and thus felt that they were leading a less meaningful life by the use of technology, then they should stop. It would be harmful were I to prescribe that a person like the One indiscriminately use technology. I then said that, through technolog...
I have never been in a situation similar to yours, so my advice may be wrong, but here it is anyway.
When people change their opinion, they sometimes go from one extreme to the opposite extreme, as if to make sure they would not drift back to their old position. But there is no need for sudden large changes. Unlike religious people, atheists do not have a duty to proselytize everyone to their non-belief. To put it bluntly, you are allowed to lie and deceive, if it is necessary for your survival. I do not support lying in general, because it has its cost, but sometimes telling the truth (at the wrong moment) has a much greater cost. The cost of lying is weakening the relationship with people you lie to. So I think you should try to be open with your wife (but be careful about your coming out), but lying to everyone else is an option.
When explaining how you feel, focus on the positive parts, not the negative parts. Rejecting religion is the negative part. It is not your terminal value to be non-religious. You probably still like some aspects of the religious culture; and that's okay. (Atheists are free to celebrate Christmas, if they choose to.) It's just that your positive values are...
(I take it "follow me" means "stay married to me despite the overt religious difference" rather than "deconvert along with me".)
Keeping secrets from your wife seems like a really bad idea. Are there ways for you to test the waters a little? (Admit to having serious doubts about your religion, maybe?) Perhaps there's something you can do along those lines that will both (1) give you some indication of what you can tell her without hurting her / making her file for divorce / leave you / ... and (2) prepare her mind so that when you tell her more it isn't such a shock.
My situation somewhat parallels yours -- formerly quite seriously religious, now very definitely (and openly) atheist, married to someone who is still seriously and actively religious. But my guess, from how you describe the situation, is that your family and friends are likely to be more bothered by irreligion than mine. (In particular, both I and my wife have plenty of friends and family who are not religious.) So I can tell you that it's all worked out OK for me so far, but I wouldn't advise you to take that as very strong evidence that openness about your (ir)religious opinions would work out well for you.
Even so, my guess is that it wouldn't be as terrible as you think it would. But, again, I don't think there's any reason for you to trust my guesses.
Large multipurpose charities like Oxfam are difficult to evaluate and (perhaps mostly for that reason, perhaps not) don't get recommendations from organizations like Givewell.
Is there anything resembling a consensus on the effectiveness of any of these charities? Better still, a comparison of them with (one or more of, or a crude estimate of the effectiveness of) Givewell's top charities?
This seems like it might be useful for at least four reasons.
Firstly, for reasons similar to Holden Karnofsky's skepticism about questionable high-EV causes, some givers might prefer a charity that does lots of obviously-probably-valuable things over one that does a single thing that seems very valuable but where some single error could make it hugely less valuable or even harmful. (For example, it might turn out that distributing mosquito nets just results in mosquitoes evolving resistance, so that after a couple of years the nets no longer do much good and other ways of dealing with the mosquitoes have become less effective.) So even if it turns out that Oxfam is half as effective (in expectation) as AMF, you might still prefer Oxfam on these grounds.
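That trade-off can be made concrete with a toy expected-utility sketch. The numbers below are invented purely for illustration: a "focused" charity with a higher expected value but a single failure mode, versus a "diversified" one whose realized value stays near its (lower) expectation.

```python
import math

# Hypothetical lotteries of (probability, value) pairs; numbers invented.
# Focused charity: big payoff unless a single failure mode hits.
focused = [(0.6, 10.0), (0.4, 0.1)]
# Diversified charity: many independent programs, so value is nearly certain.
diversified = [(1.0, 5.0)]

def ev(lottery):
    """Expected value of a (probability, value) lottery."""
    return sum(p * v for p, v in lottery)

def eu(lottery, u=math.log):
    """Expected utility under a concave (risk-averse) utility function."""
    return sum(p * u(v) for p, v in lottery)

print(ev(focused), ev(diversified))   # 6.04 vs 5.0: focused wins on expectation
print(eu(focused) < eu(diversified))  # True: risk-averse giver prefers diversified
```

So under these made-up numbers, even though the focused charity has a higher expected value, a giver with a concave utility over outcomes still prefers the diversified one.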
Secondly, some givers may be uneasy about w...
How would you respond if I said I'm a rationalist, but I don't feel a strong motivation to make the world a better place?
To be clear, I do recognize that making the world a better place is a good thing; I just don't feel much intrinsic motivation to actually do it.
I guess in part it's because I expect genuinely trying to improve things (rather than making a token effort) to be a rather difficult and thankless task.
Also, as far as I can tell, my psychological makeup is such that feeling, thinking, or being told that I'm "obligated" to do something actually decreases my motivation. So the idea that "I'm supposed to do that because it's the ethical thing to do" doesn't work for me either.
I do like the idea of making the world a better place as long as I can do it while doing something that inspires me or that I feel good about doing. Part of the reason, I think, is that I don't see myself being able to do something I really don't enjoy for long enough that it produces meaningful results. So in order for it to work, it pretty much has to be something I actually like doing.
In the end, I estimate that I'm more likely to accomplish things with social benefit if I focus on my ow...
The standard pledge for people in the rationalist sphere trying to make the world a better place is 10% of income to efficient charities, which if you're making the typical kind of money for this site's demographics, is closer to "token" than "difficult and thankless task", even if it's loads more than most people do.
Personally, my own response was to notice how little guilt I felt for not living up to moral obligations and decide I was evil and functionally become an egoist while still thinking of utilitarianism as "the true morality".
To any football fans out there: I think the outrage over the Seahawks' decision to throw on the goal line is a classic example of hindsight bias. Throwing on the goal line is hardly unheard of, and they couldn't have run it three times anyway. This FiveThirtyEight article explains why throwing actually was a good decision. Everyone thinks the decision to throw was terrible, and I think they're falling victim to hindsight bias.
On LessWrong, or on blogs by LWers, advice has been given on how to become bisexual or polyamorous.
However, there is no advice on LessWrong for how to stop liking something. Yet there are many stories of people having great difficulty giving up such things as video games and internet distractions. It seems to be easier to acquire a taste than to relinquish it.
All the advice on resisting video games and the like (internet blockers, social support) has been on using tricks of one sort or another to restrict the act, not the desire. Even when experimenting with specific deeds, it is easier to try something in spite of aversion than to forego it in spite of attraction.
Are there effective methods of ceasing to enjoy some activity, or of refraining from enjoyable things? What presently enjoyable activities would you use them on?
All the advice on resisting video games and the like (internet blockers, social support) has been on using tricks of one sort or another to restrict the act, not the desire.
Some advice is about substitution, i.e. you identify the emotional need driving a stubborn behavior and find a more approved behavior that satisfies the same need.
What's wrong with "giving advice on how to become heterosexual, or monogamous" to someone who wants to become heterosexual or monogamous?
A thought about heritability and malleability:
The heritability of height has increased because the nutritional environment has become more uniform. To be very specific, "more uniform" means both that people have more similar sets of options, and that they exercise similar preferences among those options.
This is interesting, because the increased heritability has coincided exactly with an increased importance of environmental factors from a decision making standpoint. In other words, a contemporary parent picking from {underfeed kids, don't underf...
Might it be reasonable to think of the anti-vaccination movement as people trying to take heroic responsibility without having good judgement?
I've recently had a discussion about ethics here in this thread, and the conclusion I've arrived at is that a big reason for my lack of motivation is lack of social support.
I don't know if this is the right place to post this, nor am I fully clear on what kind of response I am expecting. I guess I would like advice and emotional support with this issue.
I have basically been in shutdown mode for the past year because I'm not getting the kind of support I need, and I have my doubts I will ever get the kind of support I need.
I am in my mid-twenties, highly in...
Since neither is listed on the best textbooks thread, can anyone recommend good textbooks for
1) Social psychology
2) Cognitive psychology
?
I think polyamory is big in the rationalist community; what is the consensus on the effects of experimenting with it on later satisfaction with monogamy?
Disclaimer: the identity theory that I actually alieve is the most common intuitionist one, and it's philosophically inconsistent: I regard teleportation as death, but not sleep. This comment, however, is written from a System 2 perspective, which can operate even with concepts that I don't alieve.
The basic idea behind timeless identity is that "I" can only be meaningfully defined inductively as "an entity that has experience continuity with my current self". Thus, we can safely replace "I value my life" with "I value the e...
In a previous thread, I brought up the subject of entropy being subjective and got a lot of interesting responses. One point of contention was that if you know the positions and velocities of all the molecules in a hot cup of tea, then its temperature is actually at absolute zero (!). I realized that the explanation of this in usual terms is a bit clumsy and awkward. I'm thinking maybe if this could be explained in terms of reversible operations on strings of bits (abstracting away from molecules and any solid physical grounding), it might be easier to pre...
Isn't all this just punning on definitions? If the particle velocities in a gas are Maxwell-Boltzmann distributed for some parameter T, we can say that the gas has "Maxwell-Boltzmann temperature T". Then there is a separate Jaynes-style definition of "temperature" in terms of the knowledge someone has about the gas. If all you know is that the velocities follow a certain distribution, then the two definitions coincide. But if you happen to know more about it, it is still the case that almost all interesting properties follow from the coarse-grained velocity distribution (the gas will still melt ice cubes and so on), so rather than saying that it has zero temperature, should we not just note that the information-based definition no longer captures the ordinary notion of temperature?
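The knowledge-dependence of the information-based definition can be sketched with Shannon entropy over microstates. A toy example, assuming for simplicity a system with just 8 equally likely microstates: the entropy is 3 bits under coarse-grained knowledge and drops to zero once the exact microstate is known, even though the system itself hasn't changed.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Coarse-grained knowledge: any of 8 microstates, equally likely.
coarse = [1/8] * 8
# Maximal knowledge: the exact microstate is known with certainty.
exact = [1.0]

print(shannon_entropy(coarse))  # 3.0 bits
print(shannon_entropy(exact))   # 0.0 bits: entropy vanishes with full knowledge
```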
The LessWrong logo seems to be broken at http://wiki.lesswrong.com/wiki/How_To_Actually_Change_Your_Mind.
(more generally, there's no clear place to post about technical issues)
My last vaccination was when I was 8, in Germany; I had none in my teenage years. I'm now 28 and male. To what extent is it worthwhile for me to go to a doctor now for vaccinations?
Should we be concerned about exposure to RF radiation? I had always assumed not, since it doesn't affect humans beyond heating, but then I found these:
http://www.emfhealthy.com/wp-content/uploads/2014/12/2012SummaryforthePublic.pdf
http://www.sciencedirect.com/science/article/pii/S0160412014001354
The only mechanism they suggest for non-thermal effects is:
changes to protein conformations and binding properties, and an increase in the production of reactive oxygen species (ROS) that may lead to DNA damage (Challis, 2005 and La Vignera et al., 2012)
One ...
If cars were just invented yesterday, knowing what you know about humans, would you think that it'd be sane to let people drive the way they currently do (speeds, traffic, conditions...)? I wouldn't.
What's LessWrong's collective opinion on the efficient markets hypothesis? From my Facebook feed I vaguely recall Eliezer supporting it, and it also appeared in some of the Sequences. On the other hand, there is a post published here called A guide to rational investing, which states that "the EMH is now the noble lie of the economics profession".
I have a well-read layman's understanding of both the hypothesis and the various arguments for and against it, and would like to know what this community's opinion is.
False: There are no $20 bills lying on the ground because someone would have picked them up already.
True: If there are a lot of people scanning the ground with high-powered money detectors, you are not going to find enough $20 bills with your naked eye to make a living on.
What are some papers arguing that one shouldn't dedicate almost all efforts to decrease existential risk? I ask this because all the papers I've read have made extremely good arguments on why decreasing x-risk is important, but I've found none saying that it's not so important, and I want to be informed before spending so much time and effort decreasing x-risk.
For a couple of days, I've been trying to explain to pinyaka why minds-in-general, and specifically, maximizers, are not necessarily reward maximizers. It's really forced me to flesh out my current understanding of AGI. I wrote the most detailed natural language explanation of why minds-in-general and maximizers are not necessarily reward maximizers that I could muster in my most recent reply, and just in case it still didn't click for pinyaka, I thought I'd prepare a pseudocode example since I had a sense that I could do it. Then I thought that instead of...
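A minimal sketch of the distinction (the action names and numbers are invented for illustration, not anyone's actual pseudocode): a utility maximizer scores actions by a feature of its predicted world, while a reward maximizer scores them by its own internal signal, so only the latter is drawn to wireheading.

```python
# Each candidate action is annotated with (a) the paperclips it is predicted
# to add to the world and (b) the internal reward signal it would produce.
# Hypothetical toy numbers for illustration.
actions = {
    "build_factory":   {"paperclips": 100, "reward_signal": 1},
    "hack_own_reward": {"paperclips": 0,   "reward_signal": 999},
}

def utility_maximizer(acts):
    """Maximizes a feature of the predicted world, not any internal signal."""
    return max(acts, key=lambda a: acts[a]["paperclips"])

def reward_maximizer(acts):
    """Maximizes its own internal reward register, however it is produced."""
    return max(acts, key=lambda a: acts[a]["reward_signal"])

print(utility_maximizer(actions))  # build_factory
print(reward_maximizer(actions))   # hack_own_reward
```

The two agents face identical options and diverge only in what they score, which is the sense in which not every maximizer is a reward maximizer.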
This columnist argues that more personal freedom is worth a few more sick people dying. In other words, preventing death from disease is not a terminal goal for him; it's sacrificeable for his actual terminal goal of less government intrusion. Setting aside the mindkill potential over Obamacare, I find his choice of terminal goals worrying.
preventing death from disease is not a terminal goal for him; it's sacrificeable
You're using the wrong framework, one which assumes that in every choice there must be only one terminal goal, and that if you sacrifice anything, that sacrifice must not be terminal.
A more useful framework would recognize that there is a network of terminal (and other) goals and that most decisions involve trade-offs. It's very common to give up a measure of satisfaction of some terminal goals in order to achieve satisfaction of other terminal goals.
In this specific case, trading off death from disease against government intrusion sounds like a normal balance to me -- your choice is a function of your values and of how much death prevention you get/avoid in exchange for how much government intrusion. In specific situations I can see myself leaning either this way or that way.
I find your worry over the trade-off between terminal goals worrying :-P
If you drive, cross the road, eat desserts, etc., then you are (for yourself) trading off your own prospects of life and death against other things.
In college, I had a professor ask us to pick any subject, make up any 'facts', and try to make a compelling argument. He then had us evaluate other people's essays. Let's just say I wasn't impressed with some of my fellow classmates' arguments.
Sometimes you see this in the courtroom as a failure to state a claim.
Would it be interesting to have an open thread where we try this out?
[pollid:814]
Can anyone please fill me in on what's the big damn deal with PUA on LessWrong? It seems to be geared toward screwing women who are only as deep as their genitals get. Who the hell cares? Pretty sure every guy here would love a girl to have fun conversations with. Pretty sure
Most of the stuff I've seen against it is about scaring people or making LW a semi-cult or some other non-solution to a problem some guys REALLY want to solve here, without going to /r/socialskills and following an endless recursion of useless pathetic nonsense, as per my experience. M...
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Previous Open Thread
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.