If you have "something to protect", if your desire to be rational is driven by something outside of itself, what is the point of having a secret identity? If each student has that something, each student has a reason to learn to be rational -- outside of having their own rationality dojo someday -- and we manage to dodge that particular failure mode. Is having a secret identity a particular way we could guarantee that each rationality instructor has "something to protect"?
Sheridan: "What do you want?" Kosh: "Never ask that question!"
People are like dogs: they just sort of do things arbitrarily. If you look beyond the smoke and mirrors of your surface preferences, all you're going to find behind them is more smoke and mirrors. A wise man once suggested to me that I should just treat my brain as an oracle for preferences -- give it as good data as I can, and as much processing power as it needs, and take what it spits out as gospel, rather than seeking the underlying principles.
But don't you want to understand the underlying principles?
It seems that in order to get Archimedes to make a discovery that won't be widely accepted for hundreds of years, you yourself have to make a discovery that won't be widely accepted for hundreds of years; you have to be just as far in the dark as you want Archimedes to be. So talking about plant rights would probably produce something useful on the other end, but only if what you say is honestly new and difficult to think about. If I wanted Archimedes to discover Bayes' theorem, I would need to put someone on the line who is doing mathematics hundreds of years ahead of their time, and hope they have a breakthrough.
This is silly.
I think that perhaps you may be missing the point.
I'm thinking about why I care about why I care about what I'm thinking, and I'm realizing that I have other things that I need to do, and that realization is not helping me get past this moment.
One: I support the above post. I've seen quite a few communities die for that very reason.
Two: Gurren Lagann? (pause) Gurren Lagann? Who the h*ll do you think I am?
Post in this thread if you live in the midwestern USA or nearby areas of Canada, ideally roughly within a day's drive of Chicago.
EDIT: For anyone in this area, Penguicon may be a good location for a meetup. It's a mixed sci-fi/open-source/general-geekery convention in the Detroit area, and just might possibly have at least one guest that LW readers would be interested to meet. I probably won't be there this year, though.
Until fairly recently I lived in Ann Arbor. I live in Saginaw now.
This post seems too vague to be useful.
I just got done re-reading Steven Pinker's book How the Mind Works, and seeing the phrase "largely circumstantial" in this post reminded me of Pinker's discussion of the so-called "nature-nurture issue." He points out that it's absurd to think that because nature is important, nurture doesn't matter, but he compares the statement "nature and nurture are both important" to statements like "The behavior of a computer comes from a complex interaction between the processor and the input," which is "true but useless."
I feel the same way about statements like "more is possible." I understand the desire to be inspirational, but my brain is objecting too much. How much more is possible? Under what circumstances? etc.
I believe the point is that we do not know how much more is possible, or what circumstances make that so. As such, we must check, as often as we can, to make absolutely sure that we are still held by our chains.
Failure mode: My "something to protect" is to spread rationality throughout the world and to raise the sanity waterline, which is best achieved by having my own rationality dojo.
Beware the meta.
I agree. I think that failure mode might then be better avoided by restricting possible "somethings", as opposed to adding another requirement on to one's reasons for wanting to be rational.