Where this chain of reasoning breaks down for me is in the "without resistance" phase of "take right action without resistance". If the resistance, both conscious and unconscious, is too strong, there will be no right action taken, whether I will it or no. So what I do instead is undermine the resistance itself. This is my precondition for taking right action. Do you see what I mean? The requirement of wu-wei prevents hedonism if wu-wei is essential to hedonism but no wu-wei is possible.
The similarity between our approaches is as you say: the realization that akrasia defeats frontal assaults with heavy casualties. The difference is that you are doing something like the "take right action without resistance" approach that I've encountered before in Buddhism, which matches up nicely with anhedonia (personally, I am a hedonist, so this does not work for me); while I am attempting to root out the basic causes of my akrasia at their very sources, to change the way I feel in the first place. Both approaches have their merits, and I...
Two Disclaimers: First, I am not a doctor. Second, beware of other-optimizing. This advice is working well for me, but it may not work well for others.
The depression became obvious and major enough that I was forced to take action to stop it. The rationalizations had run dry, so I fully realized, in both System 1 and System 2, that I was not "unmotivated"; I was mentally ill. Years of life hacks and half-assed lifestyle interventions had accomplished something, but not enough, so it was time for medications, which I had previously feared due to bad expe...
Agreed. Personal anecdote: since I redefined my "motivation problem" as a "depression and anxiety problem" a number of months ago, and began treating the depression and anxiety instead of wearily trying out yet another willpower hack, I have made more progress in being motivated in those months than I had in the previous years.
Your point about describing Harry's thinking is well-taken. I just had my brother submit this as a review, to err on the side of caution:
"With NickRoy's permission, I am submitting his solution, which I agree with, with additional evidence appended, just in case that is necessary; so consider this as superseding NickRoy's submission:
[the relevant text is here in the submission, but I don't need to repeat it in this comment]
Appended:
Harry does not know the full prophecy for certain, but he can guess it, based on: Harry's thought on star lifting in r...
Voldemort would be skeptical, yes, but he would also be interested, because "6. It is impossible to tell lies in Parseltongue" and because all this speech has to do is raise the risk enough that it makes more sense to stop and gather more information before killing Harry, thus it "allow[s] Harry to evade immediate death". What do you think would improve the believability?
Sure. Along with the centaur evidence, there's: Harry's thought on star lifting in response to this prophecy in Ch. 21, Harry noticing Quirrelmort's interest in the same prophecy in Ch. 86, Quirrelmort's talk of the stars' vulnerability to "sufficiently intelligent idiocy" in Ch. 95, Voldemort's "while the stars yet live" remark in Ch. 111, Voldemort's more explicit talk on the prophecy and his great fear of it in the next chapter, and how the Unbreakable Vow is framed in the most recent chapter. If Harry connects these dots, he'll have a good idea of what the full prophecy says.
Harry hisses "You have missinterpreted prophecy, to your great peril, becausse of power I have, but you know not. Yess, you are sstudying sscience, but, honesstly, you are yearss behind me. It may be that thiss power you know not iss ssomething I have at thiss sspecific time, that you will not know for too many yearss hence.
Before I explain, remember my Vow, and know my honesst intention not to desstroy the world, Vow or no. Now, do you know why I would tear apart the very sstarss? Do you know how? Not to desstroy the world, but to ssave it from what...
Paths of Glory (1957), film. Kirk Douglas vs. Moloch. An anti-war film, for reasons both usual and unusual.
Personally, I figure I'm not intelligent enough to research hard problems and I lack the social skills to be an activist, so by process of elimination the best path open to me for doing some serious good is making some serious money. Admittedly, some serious student loan debt also pushes me in this direction!
"Proving useful in your life" (but not necessarily "proving beneficial") is the core of instrumental rationality, but what's useful is not necessarily what's true, so it's important to refrain from using that metric in epistemic rationality.
Example: cognitive behavioral therapy is often useful "to solve problems concerning dysfunctional emotions", but not useful for pursuing truth. Mindfulness-based cognitive therapy is an example more relevant to Buddhism.
So, with a 60% chance of girlfriend breakup and a 90% chance of new partner acquisition, does this mean a 36% chance of a polyamorous, open, "cheating" or otherwise non-monogamous relationship situation for you at some point over the next year?
Edited to add: actually somewhat higher than 36%, since multiple new partners are possible along with a girlfriend breakup.
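For concreteness, here is the arithmetic behind that 36% figure as a minimal Python sketch, assuming the breakup and new-partner events are independent (a simplifying assumption on my part):

```python
# A concurrent (non-monogamous) situation needs the current relationship to
# survive the year while at least one new partner is acquired.
p_breakup = 0.60        # chance the girlfriend relationship ends this year
p_new_partner = 0.90    # chance of acquiring at least one new partner

p_non_monogamous = (1 - p_breakup) * p_new_partner
print(f"{p_non_monogamous:.2f}")  # 0.36
```

As the edit notes, this is a lower bound: a breakup followed by two or more overlapping new partners would also count, pushing the true figure somewhat above 36%.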
Interesting! What do you think a "bi" listing can signal? Openness to experience?
Edited for clarity. Also: I'm not complaining, but I am genuinely curious as to why this comment has been downvoted. Is this a sensitive topic?
I currently route around this by being an ethical egoist, though I admit that I still have a lot to learn when it comes to metaethics. (And I'm not just leaving it at "I still have a lot to learn", either - I'm taking active steps to learn more, and I'm not just signalling that, and I'm not just signalling that I'm not signalling that, etc.!)
My thoughts on further social business opportunities: how about rationality consulting? If SI/LessWrong can establish enough credibility as rationalists, this would be worth money to non-profit organizations and for-profit businesses alike, and potentially to consumers as well (as with Eliezer's rationality books). Rationality consulting would probably have to be done for free at first, of course. As a secondary benefit, it would also help with the ongoing effort to measure the impact rationality training has on an individual or an organization.
On a meta level, o...
Non-profit organizations like SI need robust, sustainable resource strategies. Donations and grants are not reliable. According to my university Social Entrepreneurship course, social businesses are the best resource strategy available. The Singularity Summit is a profitable and expanding example of a social business.
My question: is SI planning on creating more social businesses (either related or unrelated to the organization's mission) to address long-term funding needs?
By the way, I appreciate SI working on its transparency. According to my studies, tr...
What strikes me most about this post: the enthusiasm! I find it refreshing for this site and appropriate for this subject matter. Congratulations on successfully feeling rational, D_Malik.
Why not use several different methodologies on GiveWell, instead of just one, since there is some disagreement over methodologies? I can understand giving your favorite methodology top billing, of course (because you believe it is best, it is your site, and it avoids confusion among donors), but there seems to be room for more than one.
True. It might be interesting to see if any hidden commonalities among Less Wrongians exist, however, if the "Other" option comes with a fill-in-the-blank field. It would also be a good idea to keep the existing specific options alongside "Other", so that not everyone simply checks "Other".
I'm considering the possibility of an experimental treatment becoming available during those two months that could save the terminally ill patient from dying of that illness. Being alive would then allow the possibility of new life extension treatments, which could lead to a very long life indeed.
This would be a conjunction of possibilities, so I realize that the overall probability of a terminally ill person transitioning to a very long-lived person is slim, but even a slim chance of living for a very long time is worth almost any degree of suffering. If no ...
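To make the conjunction explicit, a minimal sketch with purely illustrative probabilities (all of these numbers are assumptions for the example, not estimates from the comment above):

```python
# A conjunction of independent possibilities: the overall probability is the
# product of the parts, so it shrinks quickly, but it stays nonzero.
p_treatment_appears = 0.05  # experimental treatment arrives in two months (assumed)
p_treatment_works = 0.30    # the treatment actually saves the patient (assumed)
p_life_extension = 0.10     # later life-extension tech arrives in time (assumed)

p_very_long_life = p_treatment_appears * p_treatment_works * p_life_extension
print(f"{p_very_long_life:.4f}")  # 0.0015 - slim, but not zero
```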
A long-term goal of Less Wrong is to achieve the benefits of religion without becoming a religion. LW Meetups are partially an attempt at achieving a rationalist sense of community, for instance. Stanislav Petrov Day and Vasili Arkhipov Day are steps in the direction of rationalist rituals in general and rationalist holidays in particular. In addition to creating more holidays, I suggest that we figure out ways to celebrate them, rather than simply marking them.
I'll take a crack at it. A holiday celebrating existential risk reduction is a glorious opportuni...
Modern medicine, for keeping me sane. Without aspirin, my TMJ pain would be serious trouble.
You can thank ancient medicine (not modern) for the use of "aspirin" to treat joint pain. Using medicines derived from willow trees and other salicylate-rich plants for pain relief has been around since at least Hippocrates and probably even the Sumerians.
Yet, at least. Hypothetical example: I wonder if something like the Voluntary Human Extinction Movement will eventually switch out the "Voluntary" in favor of "Mandatory". But that's speculative, and you are right empirically.
Actually, I just had a chilling realization regarding that. From chapter 62:
'"No," said the old wizard's voice. "I do not think so. The Death Eaters learned, toward the end of the war, not to attack the Order's families. And if Voldemort is now acting without his former companions, he still knows that it is I who make the decisions for now, and he knows that I would give him nothing for any threat to your family. I have taught him that I do not give in to blackmail, and so he will not try."
Harry turned back then, and saw a coldness ...
Not necessarily. As Wikipedia says, "According to the Great Filter hypothesis at least one of these steps - if the list were complete - must be improbable." That is, if "Great Filter" means anything, it's that one or more of the steps to achieving a technological civilization that can expand throughout the galaxy is very difficult ("improbable").
What I'm talking about goes like this: suppose that none of the steps are very difficult. Of course, that doesn't mean they're instantaneous - each step takes time. You need elements o...
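To see why the hypothesis demands an improbable step, here is a toy calculation (a minimal sketch; the star count, the number of steps, and the per-step probabilities are all made-up illustration, not figures from Wikipedia or from the comment above):

```python
# Toy Great Filter arithmetic: if no step is very difficult, the product of
# per-step probabilities stays large enough that a galaxy's worth of stars
# should yield many expanding civilizations - contradicting what we observe.
n_stars = 1e11                  # rough number of stars in one galaxy (assumed)
step_probabilities = [0.5] * 9  # nine steps, none "very difficult" (assumed)

p_civilization = 1.0
for p in step_probabilities:
    p_civilization *= p  # about 0.002 after all nine steps

expected_civilizations = n_stars * p_civilization
print(f"{expected_civilizations:,.0f}")  # roughly 200 million
```

The timing point above is the alternative way out: even with no improbable step, enough time per step could mean that no civilization has finished the sequence yet.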
I resolved my typical adolescent existential crisis (for the time being) in a somewhat atypical fashion, concluding after much deliberation that I ought to pause the crisis until I know what's True and what's not, which might mean pausing it forever.
How can I resolve an existential crisis without knowing what meaning, purpose, value, etc. Truly are? Rationality makes the most persuasive claim to the distillation of Truth, so I am an aspiring rationalist.
Taken.