Do you have a picture of the poster that comes with a $40 pledge? Also, do you still get the poster if you pledge more?
I'm the kid in the corner with the laptop
Probably what I'll end up doing. Just checking first is all.
Not sure if open thread is the best place to put this, but oh well.
I'm starting at Rutgers New Brunswick in a few weeks. There aren't any regular meetups in that area, but I figure there have to be at least a few people around there who read lesswrong. If any of you see this I'd be really interested in getting in touch.
I suppose modafinil should be in the same boat as caffeine for the purposes of this experiment.
I cried twice reading this. That puts it just below Humanism part 3 on my list of most touching chapters.
I cried for real for the first time in years, and it made me very confused/uncomfortable with my feelings.
Quirrell in Methods has pretty much stated that he's trying to mold Harry into a Dark Lord. That requires Harry to be alive and is significantly more likely if he doesn't have Hermione's moral influence.
You will not be thrown in an asylum for discussing this with a professional
My experience disagrees. I went to see a professional for antidepressants, was emotionally stable at that moment, and was thrown in a psych ward for a week. I had to lie about my condition to be released. The whole affair failed to help in any way.
If my inhibitions regarding a certain course of action seem entirely internal, go through with it because I'm probably limiting my options for no good reason.
You would be correct. Thanks for the link.
Ah; sorry!
I'm planning to donate 30% of my pre-tax income this year, which I expect to be about $55K.
(Douglas_Knight's link also includes Julia's planned donations, which are $43K.)
Day = Made
How much money do you have to donate, if you don't mind my asking?
Kinda awkward to say aloud. I think Institute for the Research of Machine Intelligence would sound better. Minor nitpick.
Really? To me, the extra words in "Institute for the Research of Machine Intelligence" feel redundant, and MIRI is better for being concise and to the point.
This is a question about utilitarianism, not AI, but can anyone explain (or provide a link to an explanation) of why reducing the total suffering in the world is considered so important? I thought that we pretty much agreed that morality is based on moral intuitions and it seems pretty counterintuitive to value the states of mind of people too numerous to sympathize with as highly as people here do.
It seems to me that reducing suffering in a numbers game is the kind of thing you would say is your goal because it makes you sound like a good person, rather ...
You don't have to be specific, but how would grossing out the gatekeeper bring you closer to escape?
Like when you say "horrible, horrible things". What do you mean?
Driving a wedge between the gatekeeper and his or her loved ones? Threats? Exploiting any guilt or self-loathing the gatekeeper feels? Appealing to the gatekeeper's sense of obligation by twisting his or her interpretation of authority figures, objects of admiration, and internalized sense of honor? Asserting cynicism and general apathy towards the fate of mankind?
For all but the last one it seems like you'd need an in-depth knowledge of the gatekeeper's psyche and personal life.
But you wouldn't actually be posting it, you would be posting the fact that you conceive it possible for someone to post it, which you've clearly already done.
You really relish the whole "scariest person the internet has ever introduced me to" thing, don't you?
Yes. Yes, I do.
Derren Brown is way better, btw. Completely out of my league.
Could you give me a hypothetical? I really can't imagine anything I could say that would be so terrible.
I'd prefer not to. If I successfully made my point, then I'd have posted exactly the kind of thing I said I wouldn't want to be known as being capable of posting.
Adult readers never seriously maintain that fictitious characters exist
A) "Never" is a strong word. I imagine there are all kinds of mental disorders that can lead certain adults to confuse fiction with reality
B) "Existence" here is a cached term used for simplifying a concept to the point of being inaccurate. When a person says that, for instance, Frodo Baggins doesn't exist, he or she would be entirely incorrect to say that there is nothing in existence that matches the concept of Frodo Baggins. What the person is actually saying, ...
I can't imagine anything I could say that would make people I know hate me without specifically referring to their personal lives. What kind of talk do you have in mind?
Have there been any interesting AI box experiments with open logs? Everyone seems to insist on secrecy, which only serves to make me more curious. I get the feeling that, sooner or later, everyone on this site will be forced to try the experiment just to see what really happens.
Only read "External" so far, but I propose god(s) be divided into "trusted and idealized authority figures", "internalized sense of commitment to integrity of respected and admirable reputation (honor)", and "external personification of inner conscience".
If people cite God as the source of spiritual value, it's because he represents a combination of these things and the belief that their values are ingrained in reality. God isn't the root cause, and taking Him out of the equation still leaves the relevant feelings an...
This post was from a while ago, and I don't think anyone with access to the note is still around to supply it. You could try asking everyone here for a copy and see if anything comes of it.
Yes, actually.
This seems interesting. Are you just doing the whole thing through email? Also, voluntary response isn't a great way to get accurate results, but I guess it's all you have to work with.
I squeed when I saw this post and you should have shown the .mov series, everyone finds those funny.
Also, I don't think I can say that the root cause of climate change denial and cartoon hatedom is the exact same bias. With cartoons, people mostly reject them for fear of falling out of line with a vague but undeniably present cultural standard that could cause them grief in the future. With climate change, the issue has become so muddled in politics that clear lines have been drawn and to cross them would be labeled betrayal. Also, there are various non-sc...
72% probability of welcoming you to the herd
That's... an interesting analysis. Can I ask whether you're speaking from experience, or is that too personal? If not, do you have any links for where you got your information? I myself feel self-destructive from time to time, and I think that's a pretty good description of the emotions involved, so I'm a bit curious here.
The first two are from experience; the second two are from anecdotes whose sources I mostly forget, plus a dash of experience.
I don't have nearly as much experience with suicidal thoughts that can be interpreted as "wanting to die", but I can report that the standard "too much suffering to cope with" explanation isn't universal.
Freedom to make any sort of arrangement as long as all parties are willing. A "contract" would be a formal agreement. If you bring force into the mixture you'll end up with more problems than if you don't. You can't have everyone and their grandmother making arbitrary agreements and then using state power to coerce others into following through, so let them make arbitrary agreements and sort it out amongst themselves. Otherwise you get as much injustice as if you'd just allowed the government to dictate your affairs on a whim.
That still means he wanted to die, but the nature of his desire provokes extreme sympathy.
The psychology of suicide can get a lot more complicated than that. Feeling you absolutely must do something, but you can't bring yourself to lie down and wait, or to go to the hospital, or to call a hotline, or to take a shower, so you do the only thing you can. Watching yourself plan your own suicide, thinking "Huh. That's probably a bad idea. I wonder if I'll actually go through with it?". Being desperate both to die and to live, and picking whichever you're drifting towards until it happens to be death. Letting the suicidal part of you run the show, not because you share its goal, but because it's the only one that can get you out of bed.
It seems to me the problem here is that the private contracts would be enforced in the hypothetical model. Libertarians seem to propose that the legal benefits of marriage as opposed to the arbitrary spiritual components are the aspect of marriage to be agreed upon. I disagree.
I think that people should be allowed to create private contracts for any issue, but only if those contracts are not enforced. Both parties must remain willing participants throughout the process. Also, if the state deems any contracts unacceptably offensive, or contrary to public in...
I'm really not sure if the fact that he wanted to die makes it better or worse...
He didn't want to die, he couldn't handle going on living.
Good post, but I can easily imagine awesome ways to starve hundreds of children.
"Awesome" to me means impressive and exciting in a pleasant manner. You seem to use it to mean desirable. If morality just means desirability, then there's no reasons to use the word morality. I think that for morality to have any use, it has to be a component of desirability, but not interchangeable with it.
You posted this here just for an excuse to ask the poll, didn't you?
I think that this thread will go better, by the established norms of LW, if we stick to single, small topics that can actually be taken apart. The question you ask has far too many nested unknowns - definition of party platforms is hard, and economic outcomes of various policies is even harder - and too many places for discussion to go off the rails. Even with this group, that debate will devolve into talking points within three layers of replies. I'd rather have that sort of discussion in an ordinary group, and use LW for political debate of the sort LW actually has an advantage at.
I believe that this is the most relevant question we can ask if we're talking politics
Why?
One of the ways politics messes up people's reasoning is that they tend to pay excessive attention to the political alignment of issues. When considering policy X, they first ask themselves, "is X a left-wing or a right-wing policy?" (I know I used to, though I try to do so less and less), which in turn is likely to subconsciously influence how skeptical they are of pieces of evidence, etc.
This isn't a very big problem for elections, where one's vote has...
This question strikes me as both too mindkill-y, and as unimportant in light of the fact that you don't get to vote for political parties, only for individual candidates in individual races. What do you think would be the important change in your behavior if you were convinced, in general, that Republicans were "better" than Democrats or vice versa, and how do you think that would impact the political process?
A prerequisite to the above question:
Do the political parties, when elected, implement the policies that they advocate when campaigning?
Are there other affiliations besides party which more accurately predict a politician's actions?
I intend to live forever or die trying
-- Groucho Marx
Okay, thanks. That was really bothering me.
I felt extreme déjà vu when I saw the title for this.
I'm pretty sure I saw a post with the same name a couple of months ago. I don't remember what the post was actually about, so I can't really compare substance, but I have to ask. Did you post this before?
Again, sorry if this is me being crazy.
No, there was a very very similar post, about how governments are already super intelligences and seem to show no evidence of fooming.
This made me laugh. Also, I knew someone would do this the second Eliezer proposed new boundaries for Lesswrongers.
The only problem I can think of with this experiment is that your post could have been flagged for one of your more overt offenses, and it simply took until then for someone to actually get around to deleting it, especially with all the controversy. You have evidence that attacking Eliezer was the straw that broke the camel's back, but maybe not strong evidence. I don't think you can get anything conclusive from this.
What you sh...
I think you mean the Litany of Gendlin, and I believe some of these rules are being newly implemented, but I could be wrong about that.
He can run his site any way he wants, and most of the ideas here are reasonable precautions given his values. That doesn't change the fact that I intuitively don't like them when I read them, and that gut reaction (or possibly its opposite) is probably shared with others here, who probably allow it to color their arguments one way or the other. Just something to keep in mind, is all.
Intuitive gut reaction. If I had an argument to make I would have said so. Any case I made would have been formed by backtracking from my initial feeling, and I'm probably not the only commenter here arguing from an "ick" or "yay" gut reaction to the idea of censorship. I thought it was worth pointing out.
Well...
I'm upset by this.
Not sure why, exactly, but yeah, definitely upset by this. Just felt like sharing.
In other words, I don't think there's a fact of the matter about "if people should die after 100 years, a thousand years, or longer or at all". The question assumes that there's some single answer that works for everyone. That seems unlikely.
Not necessarily true. The question posits the existence of an optimal outcome. It just neglects to mention what, exactly, said outcome would be optimal to. It would probably be necessary to determine the criteria a system that accounts for immortality has to meet to satisfy us before we start coming up wi...
Superhappy aliens, FAI, United Nations... There are multiple possibilities. One is that you stay healthy for, say, 100 years, then spawn once blissfully and stop existing (salmon analogy). Humans' terminal values are adjusted in a way that they don't strive for infinite individual lifespan.
Possible outcome; better than most; boring. I don't think that's really something to strive for, but my values are not yours, I guess. Also, I'm assuming we're just taking whether an outcome is desirable into account, not its probability of actually coming about.
...I
I've never actually posted more than a comment here, so I'm all for the idea.
I don't know what to make of this. It means everything I'd pieced together about people is utterly, utterly wrong, because it assumed that they all valued truth, and understanding - the pursuits of intelligence when you don't have the political trait.
"Truth" and "understanding" seem to work as applause lights in this sentence. "Status" is used to the opposite effect throughout the post.
I think your premise is a little confused. It sounds like you previously viewed status-seeking as the emotional equivalent of immoral, but...
Note: Not trying to attack your position, just curious.
but I cannot decide for sure if a fixed lifespan is such a bad idea.
Fixed by whom, might I ask?
It seems to me that associating natural death of an individual with evil is one of those side effects of evolution humans could do without.
You seem to be implying that designed death is worse. How do you figure?
I'm not so sure I followed that. Do you still get tickets as long as you pledge $25 or higher? Or if you want the poster and a ticket do you have to make 2 pledges totaling $65?