If you have no interest in eventually procreating, is serious dating worth the massive time and emotional investment necessary?
Edit: part of the reason I am asking is for external belief-checking.
Off the top of my head, some reasons why people would want to marry despite intending not to have children:
This is a stupid-questions thread, and I intentionally asked a question I thought was stupid. I think that lifelong companionship and emotional intimacy would be amazing. However, none of my attempts at achieving those resembled anything like that, and when I look at the people around me in relationships, I don't see it there either.
This doesn't make me bitter or give me weird, stupid ideas about relationships, like basing them completely on sex or abstaining because I'm not interested in procreating. It doesn't make me draw any conclusions, and it doesn't stop me from wanting a fulfilling relationship with someone else. All it made me do was stop for a second and double-check (in a thread for stupid questions) that real, fulfilling relationships are a thing that actually exists in reality, not some sort of Hollywood bullshit, and are worth the effort to obtain and maintain. I can imagine all sorts of things, but checking that this sort of thing is actually real seemed like it could be worth a 30-second forum post.
Additionally, do you have experience or evidence suggesting that the benefits of companionship and emotional intimacy are worth the high emotional and time investment costs? I am genuinely curious.
I need a list of biases for a game of Biased Pandemic at our meetup. Do suitably prepared/formatted lists exist somewhere?
I have been trying to meditate and can go about 7 minutes before boredom overwhelms me. Does it get easier?
How do you reconcile being transgender with the fact that a lot of our gender roles are culture-specific? For instance, imagine a MTF who wants to wear a dress. You can't tell this person "stop wearing dresses"; their desire to do so cannot be changed by society telling them no. Yet if they lived in another culture that didn't have dresses at all, even for women, we know they would have gone along with it when society told them not to wear a dress.
Someone - I think Brienne - recently blogged about how it feels to have become a rationalist, and that we need more insight into how people become rationalists. After the fact, being a rationalist feels so normal that we have difficulty understanding what has changed. We'd need a phenomenology of rationalists, or something like that. I wanted to follow up on that, but I can't find the post. Maybe it wasn't by Brienne. My google-fu failed me. Does anybody know which post I mean?
There was recently a lethal heat wave in Karachi.
If you go about 1000 meters below the surface of the ocean, the water gets very cold.
Why don't people try to cool off hot places by piping cold water up from the ocean? Or just bubbling air through the deep water?
My iPad is running out of space. I want to delete some games but somehow retain their save files in case I want to download and play them again. How can I do this without jailbreaking my iPad?
Not sure if this is the right place for this; if not I will be happy to move this to a more appropriate location. I just graduated college, and plan on working for a year as a math tutor. After that, I don't really have any fixed plans, and lately I have been wondering about possibly trying to work for MIRI/CFAR/similar organizations. What exactly is needed to get involved? And if this appears feasible, what should I be working on during the gap year to be ready?
Is this the obvious interpretation of Quantpedia's visual statistical summary of data on published quantitative trading strategies: that simple, daily stock strategies based on trading earnings announcements generally outperform the alternatives?
Is room for more funding zero-sum?
If the Gates Foundation sees that Effective Altruists are funding something, they're not going to fund it if they use room-for-more-funding reasoning.
It costs EAs proportionally more of their discretionary income than it costs high-net-worth individuals.
What is the joke behind the title "Highly Advanced Epistemology 101 for Beginners"? I understand that it's redundant, but is that the only reason why it's supposed to be funny, or is there some further underlying joke?
Edit: Or, to be clearer, why was the title not just "Highly Advanced Epistemology 101"? I understand that there may be a separate joke in the juxtaposition of "Highly Advanced" and "101".
I think this isn't such a big deviation from the low-hanging-fruit posts/comments I occasionally see here, but here goes anyway: what are things I shouldn't miss? Books are the only thing on my mind right now, but anything can be suggested.
Is MIRI making an FAI only with regard to humans? That is, would it do whatever best aligns with what humans want?
If so, what would happen in the case of extraterrestrial contact? All sorts of nasty situations could occur, e.g. they could have an AI as well, with a fairly different set of goals, so the two AIs might engage in some huge and terrifying conflict. Or maybe they'd just agree to cooperate because the conflict would be too costly.
So have the researchers at MIRI put something like this as a goal?
This thread is for asking any questions that might seem obvious, tangential, silly or what-have-you. Don't be shy, everyone has holes in their knowledge, though the fewer and the smaller we can make them, the better.
Please be respectful of other people admitting ignorance, and don't mock them for it, as they're doing a noble thing.
To any future monthly posters of SQ threads, please remember to add the "stupid_questions" tag.