Comment author: gscshoyru 01 June 2011 08:17:32PM 0 points [-]

Can you rewrite the last section in terms of "A" and "B" or something where appropriate, instead of "me" and "you", to make it less confusing? I almost get what you're trying to say, mostly, but I think it would clear up some confusion if the AIs talked about themselves in the third person and were clearly differentiated from me (i.e. the person writing this comment) and you (i.e. the person who wrote the post).

Thanks!

Comment author: Armok_GoB 19 May 2011 05:07:37PM 0 points [-]

I don't know how the cult alarms work, they're intuitive. I know all those things and indeed it's probably a false alarm but I thought I should mention it anyway.

Still, if religious orgs have anything to say to rationalists about rationality, then something, somewhere, is very, very wrong. That doesn't necessarily mean it's not the case, or that we shouldn't listen to them, but at the very least we should have noticed the things they're saying on our own long ago.

I never actually stated that I accepted what the feeling said, only that I HAD the feeling. I am in fact unsure of what to think and thus I'm trying to forward the raw data I'm working from (my intuitions) rather than my interpretation of what they mean. I should have made that clearer.

Besides, regardless of whether the feeling of being creeped out is justified, the fact that these posts creep people out is a problem in itself, and their authors should try to communicate the same ideas in ways that don't creep people out so much. I don't like being creeped out.

Comment author: gscshoyru 19 May 2011 05:32:33PM 6 points [-]

Ah, ok, I misunderstood you then. Sorry, and thanks for clearing that up.

I don't agree that religious organizations having something to say to rationalists about rationality is a bad thing -- they've been around much, much longer than rationalists have, and have had far more time to come up with good ideas. And the reason they needed to suggest these things instead of us working them out on our own is probably the very thing I was trying to warn against -- in general, we as a community tend to look at religious organizations as bad, and so tend to color everything they do with the same feeling of badness, which makes the things that are actually good harder to notice.

I also do not like being creeped out. But I assume the creepiness factor comes from the context (i.e. if the source of the staring exercise had never been mentioned, would it have been creepy to you?). Removing the context is probably only doable in some cases and not others (the source of meditation is known to everyone), and I'm not entirely sure it's a good thing to do anyway, if all we want is to avoid the creepiness factor. I'll have to think about that. Being creeped out and deconstructing the feeling instead of shying away is a good thing, and trains you to do it more automatically more often... but if we want the ideas to be accepted and used to make people stronger, would it not be best to state them in the way that is most acceptable? I don't know.

Comment author: Armok_GoB 19 May 2011 11:08:38AM 15 points [-]

There are way too many amazing posts with very little karma, and mediocre posts with large amounts of karma.

Not enough productive projects related to the site, like site improvements and art. The few that do show up get too little attention and karma.

Too much discussion about things like meetups, growing the community, and converting people. Those things are important, but they don't belong on LW and should probably have their own site.

There is a category of religiously inspired posts that creep me out and set off cult alarms. It contains that post about staring from Scientology, and that Transcendental Meditation stuff that, while I found it interesting and perhaps useful, doesn't seem to belong on LW, and now recently these Mormon posts about growing organizations. shudder

Comment author: gscshoyru 19 May 2011 04:49:40PM 11 points [-]

There is a category of religiously inspired posts that creep me out and set off cult alarms. It contains that post about staring from Scientology, and that Transcendental Meditation stuff that, while I found it interesting and perhaps useful, doesn't seem to belong on LW, and now recently these Mormon posts about growing organizations. shudder

What I'm about to say has been said before, but it bears repeating. What exactly about all this stuff is setting off cult alarms for you? I had a similar problem with those posts as well, until I actually went and questioned the cult alarm in my head (which was a gut reaction) and realized that it might not be a rational reaction. Just because some scary group does something does not make it a bad thing, even if they're the only people that do it -- reversed stupidity is not intelligence. And a number of the things suggested sound like good self-improvement ideas, free of religious baggage.

In general, when you're creeped out by something, you should try to figure out why you're being creeped out instead of merely accepting what the feeling suggests. Otherwise you could end up doing something bad that you wouldn't have done if you'd thought it through. Which is of course the basic purpose of the teachings on this site.

Comment author: Rain 14 May 2011 11:02:06PM *  39 points [-]

For every non-duplicate comment replying to this one praising me for my right action, I will donate $10 to SIAI, up to a cap of $1010, with the count ending on 1 June 2011. Also accepting private messages.

Edit: The cap was met on 30 May. Donation of $1010 made.

Comment author: gscshoyru 17 May 2011 06:04:11PM 2 points [-]

Sweet. A free (for me) way to donate money. Thank you very much for providing this opportunity (i.e. I praise you for your right action.)

Comment author: Bongo 12 May 2011 04:14:13PM *  4 points [-]

I'm afraid my comments were mostly driven by an inarticulate fear of cults and of association with a group as cultish as Mormons. But one specific thing I already said I'm afraid of is that of LW becoming a "rational" community instead of a rational community, differing from other communities only by the flag it rallies around.

Comment author: gscshoyru 12 May 2011 05:42:24PM *  4 points [-]

You know what... I was missing the "look for a third option" bit. There are more options than the two obvious ones -- doing this, and not doing this.

I've been having trouble making myself do the rationalisty projects that I came up with for myself, and this article suggested a way to use group pressures to make me do these things. Since I really wanted a way to make myself do these projects, the article seemed like a really, really good idea. But instead of making the whole community do this, I can actually just ask some fellow rationalists at my meetup to do this, just to me. That way I can use group pressures to help impose my own rationally determined second order desires on myself. The only thing I think I lose this way is the motivation via a sense of community, where everyone else is doing it too...

Of course, this still doesn't resolve the problem of whether or not the community at large should adopt the ideas put forth in this article. I still can't seem to think rationally about it. But at least this is a way for me to get what I want without having to worry about the negative side effects of the whole community adopting this policy.

Comment author: Bongo 12 May 2011 01:42:30PM *  1 point [-]

And that's kind of frightening too. I don't think it's too much of an exaggeration to say that this stuff is basically a cult roadmap.

Comment author: gscshoyru 12 May 2011 03:42:36PM 18 points [-]

I think we should be a little careful of using the word "cult" as a mental stop sign, since that does seem to be what's happening here. We need to be a bit more careful about labeling something with all the bad connotations of a cult just because it has some of the properties of a cult -- especially if it only seems to have the good properties. But... that doesn't mean that this good cult property won't lead to the bad cult property or properties that we don't want. You should just be more explicit as to what and how, because I'm wavering back and forth on this article being a really, really good idea (the benefits of this plan are obvious!), and a really, really scary and bad idea (if I do it, it'll make me become part of a groupthinky monster!).

The problem I have is that both sides in my own head seem to be influenced by their own clear cognitive biases -- we have the cult attractor on one hand and the accidental negative connotations and stopsigny nature of the word "cult" on the other. So if you could semi-explicitly show why adopting the idea this article puts forth would lead to some specific serious negative consequences, that would clear up my own indecision and confusion.

Comment author: Peterdjones 11 May 2011 03:02:51PM -1 points [-]

What does it mean in a concrete and substantive sense for pi to be an irrational number?

Comment author: gscshoyru 11 May 2011 04:02:23PM 0 points [-]

This is doable... Let d be the length of the diameter of some circle, and c be the circumference of the same circle. If you have an integer number (m) of sticks of length d in a straight line, and an integer number (n) of sticks of length c in a different straight line, then the two lines will be of different lengths, no matter how you choose your circle, or how you choose the two integers m and n.
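
The stick argument above can be written out as a short derivation (a sketch, using only the definition c = πd and the irrationality of π):

```latex
\text{Suppose the two lines had equal length: } m\,d = n\,c.
\text{ Since } c = \pi d, \text{ this gives}
\[
m\,d = n\,\pi\,d \;\Longrightarrow\; \pi = \frac{m}{n},
\]
\text{contradicting the irrationality of } \pi.
\text{ Hence no positive integers } m, n \text{ make the lines equal.}
```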

In general, if the axioms that prove a theorem are demonstrable in a concrete and substantive way, then any theorems proved by them should be similarly demonstrable, by deconstructing it into its component axioms. But I could be missing something.

There are sets of axioms that mathematicians use that aren't really demonstrable in the physical universe, and there are different sets of axioms under which different truths hold, ones that are not in line with the way the universe works. Non-Euclidean geometry, for example, in which Euclid's parallel postulate fails. Any theorem is true only in terms of the axioms that prove it, and the only reason we attribute certain axioms to this universe is that we can test them and the universe always behaves the way the axioms predict.

For morality, you can determine right and wrong from a societal/cultural context, with a set of "axioms" for a given society. But I have no idea how you'd test the universe to see if those cultural "axioms" are "true", like you can for mathematical ones. I don't see any reason why the universe should have such axioms.

Comment author: gscshoyru 12 April 2011 06:53:45PM 2 points [-]

Maybe I've missed something in your original article or your comments, but I don't understand why you think a person in a perfect physics simulation of the universe would feel differently enough about the qualia he or she experiences to notice a difference. Qualia are probably a physical phenomenon, yes -- but if that physical phenomenon is simulated in exact detail, how can a simulated person tell the difference? Feelings about qualia are themselves qualia, and those qualia are also simulated by the physics simulator.

Imagine for a moment that some superbeing was able to determine the exact physical laws and initial conditions of this universe, and then construct a Turing machine that simulated our universe based on those rules and initial conditions. Or, for argument's sake, imagine instead that the initial conditions plugged into the simulation were the state of the universe an hour before you wrote this article. At what point would the simulation and the real world diverge? If this world were the simulation, would the simulated you still have written this article?

If so, then what's the difference between the "you" in the two universes? You've argued in your post that your experiences would be noticeably different -- but if you're not acting on that difference, then what is this "you", exactly, and why can't it affect your actions? Or is there no such "you" -- in which case, how would the simulated you differ from a "zombie"? And how do you know there is such a "you", here and now?

If the simulated you would not have written this article -- well, then either there's something about qualia that can't be simulated, in which case qualia are not physical... or the physics simulation is imperfect, in which case it's not a perfect simulator by your definition -- and if so, why not?

Comment author: prase 01 April 2011 02:23:52PM *  -1 points [-]

Part 1, group II question:

What is the population of the Central African Republic?

Give an estimate in a subcomment. Please begin your answer with "I suppose the correct value is probably" or some other preface of comparable length; if you write just the number, it appears in the Recent Comments bar and can bias other respondents.

Comment author: gscshoyru 01 April 2011 06:14:43PM *  0 points [-]

Edit: scratch that. I was influenced by the first answer I saw on the page... my answer should not count. Leaving it for posterity, below -- I'd make it strikethrough if I knew how.

I suppose the correct value is probably 50 million.

Comment author: Alicorn 01 April 2011 02:27:42AM 10 points [-]

I've been reading Erfworld since it started, but for some reason it never occurred to me to think of it as rationalist fiction. Good catch.

Comment author: gscshoyru 01 April 2011 12:46:27PM 0 points [-]

Ditto to me too.
