The WHO guidance is that the infected person wears a mask as much as possible, uses the bathroom last for their evening routine (after everyone else), and cleans and disinfects it after use.

Thank you. I admit I didn't understand most of what you said. Sorry.

I tried meditation once and it was terrible. Emptying my head allows all the negativity to come in :).

You're referring to http://lesswrong.com/lw/js/the_bottom_line/, yes?

It just seems like those situations don't present themselves very often. More often, a situation presents itself like this: a team member makes all the wrong arguments to support a thesis I disagree with. Previously, I would just fight against each of his arguments. Now I don't do much of anything (I'm not good at convincing people); instead I keep thinking "yeah, yeah, arguments as soldiers, writing the bottom line, blah, blah" without it actually being useful.

That's not really related to my main problems, but in general rationality poses that problem for me: merely thinking in a Sequences-like fashion makes me feel sad, and I suspect it doesn't bring enough benefit to compensate.

Huh, um, okay! ^^ I somewhat suspected that that is the case, but it's still hard to believe it. (thank you)

That's helpful, thank you! Your mention of Fluid Dynamics was particularly nice.

Happy to share.

Part of it is what I replied to ChristianKl: I feel like every important thing should be reduced to thinking about EA or AI. This makes me think that I can't find any new area interesting, because it isn't good enough, or should be eliminated to divert resources into one of those two causes.

Another part, I think, is that previously I could always discover a... greater... area of interest once I grew out of the old one. When I got good at video games, I could move on to making friends, then volunteering, then attending college, then living alone. But once I started to seriously think through the Sequences, which led me to read other writing by Yudkowsky, including, sigh, "The Road to Singularity", the next step was "saving the world", and it doesn't feel like I can get much further than that.

Well, yeah, but it also gives a lot of answers and provides an argument for everything. It also feels like reading from the community reduces everything to either effective altruism or AI. That may not be true, but I've internalized it so much that I can't listen to, say, any politician's statement on the state budget without EA or AI funding immediately coming to mind. Or consider any economics, politics, or important decision-making without feeling "this is all wrong and we shouldn't care about it". It's a little disheartening :)

Not necessarily the cause of depression.

I mean, I am suspicious, and I believe it has something to do with it. At my worst during depressive periods, I keep thinking about death, altruism, rationality, AI, etc. Also, there are surveys suggesting that LessWrong members are unusually likely to have depression.

But I think my depression is mostly innate.

I keep doing that, but it's kind of hard, and I can't easily get proof of what's causing the problem.