Indon
Indon has not written any posts yet.

If the first two shapes on the bottom are diamonds, why is the third shape a square?
That's a good way to clearly demonstrate a nonempathic actor in the Prisoner's Dilemma: a "Hawk", who views their own payoffs, and only their own payoffs, as having value, and places no value on the payoffs of others.
But I don't think it's necessary. I would say that humans can visualize a nonempathic human - a bad guy - more easily than they can visualize an empathic human with slightly different motives. We've undoubtedly had to, collectively, deal with a lot of them throughout history.
A while back I was writing a paper and came across a fascinating article about types of economic actors; the article concluded that there are probably three different... (read more)
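To make the payoff distinction concrete, here's a toy sketch (the payoff matrix and the empathy weight w are purely illustrative assumptions, not taken from the article): a Hawk values an outcome by its own payoff alone, while an empathic actor also gives some weight to the other player's payoff.

```python
# Toy illustration (assumed payoff numbers): how a "Hawk" and an empathic
# actor value the same Prisoner's Dilemma outcomes.

# (my_payoff, their_payoff) for each (my_move, their_move) pair.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def hawk_value(mine, theirs):
    """A Hawk counts only its own payoff; the other player's payoff has no value."""
    return mine

def empathic_value(mine, theirs, w=0.5):
    """An empathic actor also gives some weight w to the other player's payoff."""
    return mine + w * theirs

for moves, (mine, theirs) in PAYOFFS.items():
    print(moves, "hawk:", hawk_value(mine, theirs),
          "empathic:", empathic_value(mine, theirs))
```

With w = 0 this reduces to the Hawk; with a large enough w, defection stops being the dominant choice.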
Ah, so the statement is second-order.
And while I'm pretty sure you could replace the statement with an infinite number of first-order statements that precisely describe every member of the set (0S = 1, 0SS = 2, 0SSS = 3, etc), you couldn't say "These are the only members of the set", thus excluding other chains, without talking about the set - so it'd still be second-order.
Thanks!
Okay, my brain isn't wrapping around this quite properly (though the explanation has already helped me to understand the concepts far better than my college education on the subject has!).
Consider the statement: "There exists no x for which, for some number k > 0, x after k successions is equal to zero." (The closest I can figure to depict it formally is ¬∃x ∃k>0: S^k(x) = 0, writing S^k(x) for x after k successions.) Why doesn't that axiom eliminate the possibility of any chain, finite or infinite, that involves a number below zero, and thus eliminate the possibility of the two-sided infinite chain?
Or... is that statement a second-order one, somehow, in which case how so?
Edit: Okay, the gears having turned a bit further, I'd like to add: "For all x, there exists a number k such that 0 after k successions is equal to x."
That should deal with another possible understanding of that infinite chain. Or is defining k in those axioms the problem?
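For reference, both statements written out with the same S^k(x) shorthand, i.e. x after k successions (it's that shorthand, with k being quantified over, that I suspect may be the problem):

¬∃x ∃k>0: S^k(x) = 0   (nothing reaches zero by successions)
∀x ∃k: S^k(0) = x   (everything is reached from zero by successions)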
I would suggest that the most likely reason for logical rudeness - not taking the multiple-choice - is that most arguments beyond a certain level of sophistication have more unstated premises than they have stated premises.
And I suspect it's not easy to identify unstated premises. Not just the ones you don't want to state (belief-in-belief sorts of things), but ones that you, as an arguer, simply aren't skilled enough to describe.
As an example:
For example: Nick Bostrom put forth the Simulation Argument, which is that you must disagree with either statement (1) or (2) or else agree with statement (3):
In the given summary (which may not accurately describe the full argument; for the purposes... (read more)
Perhaps, by sheer historical contingency, aspiring rationalists are recruited primarily from the atheist/libertarian/technophile cluster, which has a gender imbalance for its own reasons—having nothing to do with rationality or rationalists; and this is the entire explanation.
This seems immensely more likely than anything on that list. Libertarian ideology is tremendously dominated by white males, and, coincidentally enough, I bet the rationality community matches that demographic: primarily male and primarily Caucasian. Am I wrong? I'm not big into the rationalist community, so this is a genuine prediction on my part. Meanwhile, which of the listed justifications is equally likely to apply to both white females and non-white males?
Now, that's not to say the list of reasons has no impact. Just that the reason you dismissed out of hand almost certainly dominates the spread, and the other reasons are comparatively trivial in their impact. If you want to solve the problem, you'll need to describe it accurately.
I think that's an understatement of the potential danger of rationality in war. Not for the rationalist, mind, but for the enemy of the rationalist.
Most rationality, as elaborated on this site, isn't about impassively choosing to be a civilian or a soldier. It's about becoming less vulnerable to flaws in thinking.
And war isn't just about being shot or not shot with bullets. It's about being destroyed or not destroyed, through the exploitation of weaknesses. And a great deal of rationality, on this very site, is about how to not be destroyed by our inherent weaknesses.
A rationalist, aware of these vulnerabilities and wishing to destroy a non-rationalist, can directly apply their rationality to... (read more)
Speaking as a cat, there are a lot of people who would like to herd me. What makes your project a higher priority than everyone else's?
"Yes, but why bother with half-assed involvement in my group?" Because I'm still interested in your group. I'm just also interested in like 50 other groups, and that's on top of the one cause I actually prefer to specialize in.
...It seems to me that people in the atheist/libertarian/technophile/sf-fan/etcetera cluster often set their joining prices way way way too high.
People in the atheist/libertarian/technophile/sf-fan/etc. cluster obviously have a ton of different interests, and those interests all compete for the same limited time and energy. Why shouldn't they set high requirements for yet another interest trying to add itself to the cluster?
Reading the article, I can make a guess as to how the first challenges went; it sounds like their primary, and possibly only, resolution against the challenge was simply not to pay serious attention to the AI. That's not a very strong approach, as anyone who has argued on the internet can tell you: it's easy to get sucked into full engagement with someone who is trying to draw you in, and it's easy to keep someone engaged when they're trying to break off.
Their lack of preparation, I would guess, led to their failure against the AI.
A more advanced tactic would involve additional lines of resolution after becoming engaged; contemplating philosophical arguments... (read more)
And what makes you sure of that? It even looks like the outline for the three boxes along the top.
Our cultural assumptions are perhaps more subtle than the average person thinks.