
Can you rewrite the last section in terms of "A" and "B" or something where appropriate, instead of "me" and "you", to make it less confusing? I almost get what you're trying to say, mostly, but I think it would clear up some confusion if the AIs talked about themselves in the third person and were clearly differentiated from me (i.e. the person writing this comment) and you (i.e. the person who wrote the post).

Thanks!

Ah, ok, I misunderstood you then. Sorry, and thanks for clearing that up.

I don't agree that religious organizations having something to say to rationalists about rationality is a bad thing -- they've been around much, much longer than rationalists have, and have had far more time to come up with good ideas. And the reason they needed to suggest it instead of our working it out on our own is probably the very thing I was trying to warn against -- in general, we as a community tend to look at religious organizations as bad, and so tend to color everything they do with the same feeling of badness, which makes the things that are actually good harder to notice.

I also do not like being creeped out. But I assume the creepiness comes from the context (i.e., if the source of the staring exercise had never been mentioned, would it have been creepy to you?). Removing the context is probably only doable in some cases and not others (the source of meditation is known to everyone), and I'm not entirely sure removing it is a good thing to do anyway, if all we want is to avoid the creepiness. I'll have to think about that. Being creeped out and deconstructing the feeling instead of shying away is a good thing, and trains you to do it more automatically more often... but if we want the ideas to be accepted and used to make people stronger, wouldn't it be best to state them in the way that is most acceptable? I don't know.

gscshoyru

There is a category of religiously inspired posts that creep me out and set off cult alarms. It contains that post about staring from Scientology, and that Transcendental Meditation stuff that, while I found it interesting and perhaps useful, doesn't seem to belong on LW, and now recently these Mormon posts about growing organizations. shudder

What I'm about to say has been said before, but it bears repeating. What exactly about all this stuff is setting off cult alarms for you? I had a similar problem with those posts as well, until I actually went and questioned the cult alarm in my head (which was a gut reaction) and realized that it might not be a rational reaction. Just because some scary group does something does not make it a bad thing, even if they're the only people that do it -- reversed stupidity is not intelligence. And a number of the things suggested sound like good self-improvement practices, free of religious baggage.

In general, when you're creeped out by something, you should try to figure out why you're being creeped out instead of merely accepting what the feeling suggests. Otherwise you could end up doing something bad that you wouldn't have done if you'd thought it through. Which is of course the basic purpose of the teachings on this site.

Sweet. A free (for me) way to donate money. Thank you very much for providing this opportunity (i.e., I praise you for your right action).

You know what... I was missing the "look for a third option" bit. There are more options than the two obvious ones -- doing this, and not doing this.

I've been having trouble making myself do the rationalisty projects that I came up with for myself, and this article suggested a way to use group pressures to make me do these things. Since I really wanted a way to make myself do these projects, the article seemed like a really, really good idea. But instead of having the whole community do this, I can just ask some fellow rationalists at my meetup to apply it to me alone. That way I can use group pressures to help impose my own rationally determined second-order desires on myself. The only thing I think I lose this way is the motivation that comes from a sense of community, where everyone else is doing it too...

Of course, this still doesn't resolve the problem of whether or not the community at large should adopt the ideas put forth in this article. I still can't seem to think rationally about it. But at least this is a way for me to get what I want without having to worry about the negative side effects of the whole community adopting this policy.


I think we should be a little careful of using the word "cult" as a mental stop sign, since that does seem to be what's happening here. We need to be a bit more careful about labeling something with all the bad connotations of a cult just because it has some of the properties of a cult -- especially if it only seems to have the good properties. But... that doesn't mean that this good cult property won't lead to the bad cult property or properties that we don't want. You should just be more explicit as to what and how, because I'm wavering back and forth on this article being a really, really good idea (the benefits of this plan are obvious!), and a really, really scary and bad idea (if I do it, it'll make me become part of a groupthinky monster!).

The problem I have is that both sides in my own head seem to be influenced by their own clear cognitive biases -- we have the cult attractor on one hand and the accidental negative connotations and stopsigny nature of the word "cult" on the other. So if you could semi-explicitly show why adopting the idea this article puts forth would lead to some specific serious negative consequences, that would clear up my own indecision and confusion.

This is doable... Let d be the length of the diameter of some circle, and c be the circumference of the same circle. Then a line of an integer number (m) of sticks of length d and a separate line of an integer number (n) of sticks of length c will always differ in length, no matter how you choose your circle or how you choose the two integers m and n -- because c/d = pi, and pi is irrational.
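The stick construction above can be checked numerically. This is only a sketch, not a proof (the irrationality of pi is what actually guarantees the claim): for every pair of integers m, n up to some arbitrary bound N, the gap |m·d − n·c| stays strictly positive. The function name `smallest_gap` and the bound N = 200 are illustrative choices, not anything from the original comment.

```python
import math

def smallest_gap(N, d=1.0):
    """Smallest |m*d - n*c| over all 1 <= m, n <= N, where c = pi * d
    is the circumference of a circle with diameter d."""
    c = math.pi * d
    return min(abs(m * d - n * c)
               for m in range(1, N + 1)
               for n in range(1, N + 1))

# The gap shrinks as N grows (rational approximations of pi like 22/7
# get close), but it never reaches zero for any finite m and n.
gap = smallest_gap(200)
print(gap > 0)
```

The best near-miss in this range comes from the classic approximation 22/7 (m = 22 sticks of length d against n = 7 sticks of length c), which leaves a gap of roughly 0.009·d -- small, but never zero.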

In general, if the axioms that prove a theorem are demonstrable in a concrete and substantive way, then any theorems proved from them should be similarly demonstrable, by deconstructing them into their component axioms. But I could be missing something.

There are sets of axioms that mathematicians use that aren't really demonstrable in the physical universe, and there are different sets of axioms under which different truths hold, ones that are not in line with the way the universe works. Non-Euclidean geometry, for example, in which Euclid's parallel postulate fails. Any theorem is true only in terms of the axioms that prove it, and the only reason we attribute certain axioms to this universe is that we can test them, and the universe always works the way the axiom predicts.

For morality, you can determine right and wrong from a societal/cultural context, with a set of "axioms" for a given society. But I have no idea how you'd test the universe to see if those cultural "axioms" are "true", like you can for mathematical ones. I don't see any reason why the universe should have such axioms.

Maybe I've missed something in your original article or your comments, but I don't understand why you think a person in a perfect physics simulation of the universe would feel differently enough about the qualia he or she experiences to notice a difference. Qualia are probably a physical phenomenon, yes -- but if that physical phenomenon is simulated in exact detail, how can a simulated person tell the difference? Feelings about qualia are themselves qualia, and those qualia are also simulated by the physics simulator.

Imagine for a moment that some superbeing was able to determine the exact physical laws and initial conditions of this universe, and then construct a Turing machine that simulated our universe based on those rules and initial conditions. Or, for argument's sake, imagine instead that the initial conditions plugged into the simulation were the state of the universe an hour before you wrote this article. At what point would the simulation and the real world diverge? If this world were the simulation, would the simulated you still have written this article? If so, then what's the difference between the "you" in the two universes? You've argued in your post that your experiences would be noticeably different -- but if you're not acting on that difference, then what is this "you", exactly, and why can't it affect your actions? Or is there no such "you" -- in which case, how would the simulated you differ from a "zombie"? And how do you know there is such a "you", here and now?

If the simulated you would not have written this article -- well, then either there's something about qualia that can't be simulated, in which case qualia are not physical... or the physics simulation is imperfect, in which case it's not a perfect simulator by your definition, and if so, why not?

Edit: scratch that. I was influenced by the first answer I saw on the page... my answer should not count. Leaving it for posterity, below -- I'd make it strikethrough if I knew how.

I suppose the correct value is probably 50 million.
