All of alexg's Comments + Replies

alexg

I can't believe that this one hasn't been done before:

Unless you are Eliezer Yudkowsky, there are 3 things that are certain in life: death, taxes and the second law of thermodynamics.

alexg

Here's a very fancy cliquebot (with a couple of other characteristics) that I came up with. The bot is in one of 4 "modes" -- SELF, HOSTILE, ALLY, PASSWORD.

Before turn 1:

Simulate the enemy for 20 turns against a fixed sequence: DCCCDDCDDD, then Cs thereafter. If the enemy responds with 10 Ds followed by CCDCDDDCDC, change to mode SELF. (These are pretty much random strings -- the only requirement is that the first begins with D.)

Simulate the enemy for 10 turns against DefectBot. If the enemy cooperates in all 10 turns, change to mode HOSTILE. Else be in mode ALLY.

In any given turn,

If... (read more)
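
The comment cuts off before the per-turn behaviour, but the pre-game classification above is concrete enough to sketch. Below is a rough Python rendering of just that step; the opponent interface (a callable taking both histories and returning 'C' or 'D'), the helper names, and the omission of PASSWORD mode's trigger are assumptions forced by the truncation, not the actual tournament API.

    # Sketch of the "before turn 1" mode selection described above.
    # Assumption: an opponent is a callable opponent(their_history, our_history) -> 'C' or 'D'.
    # PASSWORD mode's trigger and the per-turn policy are cut off in the comment, so they are omitted.

    SELF, HOSTILE, ALLY, PASSWORD = "SELF", "HOSTILE", "ALLY", "PASSWORD"

    HANDSHAKE_PROBE = list("DCCCDDCDDD") + ["C"] * 10   # what we play in the 20-turn simulation
    HANDSHAKE_REPLY = ["D"] * 10 + list("CCDCDDDCDC")   # the reply that identifies a copy of this bot

    def simulate(opponent, our_moves):
        """Play the opponent against a fixed sequence of our moves; return its responses."""
        our_history, their_history = [], []
        for move in our_moves:
            response = opponent(their_history, our_history)
            their_history.append(response)
            our_history.append(move)
        return their_history

    def choose_mode(opponent):
        # 1. Handshake: does the opponent answer the probe exactly like a copy of us?
        if simulate(opponent, HANDSHAKE_PROBE) == HANDSHAKE_REPLY:
            return SELF
        # 2. Exploitability: does it cooperate for all 10 turns against a pure DefectBot?
        if all(r == "C" for r in simulate(opponent, ["D"] * 10)):
            return HOSTILE
        # 3. Otherwise treat it as a potential ally.
        return ALLY

    # Examples:
    choose_mode(lambda theirs, ours: "D")   # always-defect -> ALLY (not a copy, not exploitable)
    choose_mode(lambda theirs, ours: "C")   # always-cooperate -> HOSTILE (exploitable)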

alexg

Possibly I used it out of context. What I mean is that utility(less crime) > utility(society has an inaccurate view of the justice system) when the latter has few other consequences, and rationality is about maximising utility. Also, in the Least Convenient World, this trial will not affect any others overall, hence negating the point about the accuracy of the justice system. Here knowledge is not an end; it is a means to an end.

Richard_Kennaway
See my reply to Roxolan.
alexg

G'day

As you can probably guess, I'm Alex. I'm a high school student from Australia and have been disappointed with the education system here for quite some time.

I came to LW via HPMoR which was linked to me by a fellow member of the Aus IMO team. (I seriously doubt I'm the only (ex-)Olympian around here - seems just the sort of place that would attract them). I've spent the past few weeks reading the sequences by EY, as well as miscellaneous other stuff. Made a few (inconsequential) posts too.

I have very little in the way of controversial opinions to off... (read more)

Vaniver
Welcome! There have been previous political threads, like here, here, or here. If you search "politics," you'll find quite a bit. Here was my response to the proposal that we have political discussion threads; basically, I think politics is a suboptimal way to spend your time. It might feel useful, but that doesn't mean it is useful. Here's Raemon's comment on the norm against discussing politics. Explicitly political discussion can be found on MoreRight, founded by posters active on LessWrong, as well as on other blogs. (MoreRight is part of 'neoreaction', which Yvain has recently criticized here, for example.) I don't see what you mean by the 'pros and cons' of holding a particular ideology. Ideologies are, generally, value systems -- they define what is a pro and what is a con.
alexg

You're kind of missing the point here. I probably should have clarified my position more. The reason I want people to trust the justice system is so that people will not be inclined to commit crimes, because it would then be more likely (from their point of view) that, if they did, they would get caught. I suppose there is the issue of precedent to worry about, but the ultimate purpose of the justice system, from the consequentialist viewpoint, is to deter crimes (by either the offender it is dealing with or potential others), not to punish criminals. As the o... (read more)

Richard_Kennaway
This ignores the causal relationships. How is punishing the innocent supposed to create a stabler society? Because, in your scenario, it's just this once and no-one will ever know. But it's never just this once, and people (the judge, X, and Y at least) will know. As one might observe from a glance at the news from time to time. All you're doing is saying, "But what if it really was just this once and no-one would ever know?" To which the answer is, "How will you know?" To which the LCPW replies "But what if you did know?", engulfing the objection and Borgifying it into an extra hypothesis of your own. You might as well jump straight to your desired conclusion and say "But what if it really was Good, not Bad?" and you are no longer talking about anything in reality. Reality itself is the Least Convenient Possible World.
Richard_Kennaway
I don't think you understand what "rationality is about winning" means. It is explained here, here, and here.
alexg

Test for Consequentialism:

Suppose you are a judge deciding whether person X or person Y committed a murder. Let's also assume your society has the death penalty. A supermajority of society (say, encouraged by the popular media) has come to think that X committed the crime, which would decrease their confidence in the justice system if he is set free, but you know (e.g. because you know Bayes) that Y was responsible. We also assume you know that Y won't reoffend if set free because (say) they have been too spooked by this episode. Will you condemn X or Y? (Befor... (read more)

Richard_Kennaway
By condemning X, I uphold the people's trust in the justice system, while making it unworthy of that trust. By condemning Y, I reduce the people's trust in the justice system, while making the system worthy of their trust. But what is their trust worth, without the reality that they trust in? If I intend the justice system to be worthy of confidence, I desire to act to make it worthy of confidence. If I intend it to be unworthy of confidence, I desire to act to make it unworthy of confidence. Let me not become unattached to my desires, nor attached to what I do not desire. Also, there is no Least Convenient Possible World. The Least Convenient Possible World for your interlocutors is the Most Convenient Possible World for yourself, the one where you get to just say "Suppose that such and such, which you think is Bad, were actually Good. Then it would be Good, wouldn't it?"
alexg

I'm not sure if anyone's noticed this, but how do you know that you're not a simulation of yourself inside Omega? If he is superintelligent, he would compute your decision by simulating you, and you and your simulation would be indistinguishable.

This is fairly obviously a PD against said simulation -- if you cooperate in PD, you should one-box.

I personally am not sure, although if I had to decide I'd probably one-box.
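
As a rough illustration of why one-boxing wins under that premise, here is a toy payoff check in Python using the conventional Newcomb amounts ($1,000 in the transparent box, $1,000,000 in the opaque box iff Omega predicts one-boxing) and the comment's assumption that the simulation's choice always matches yours; the dollar figures are the standard ones, not anything stated in the comment.

    # Toy Newcomb payoff check, assuming Omega's prediction mirrors your actual choice
    # (the premise above: you and your simulation are indistinguishable, so you decide alike).
    # Dollar amounts are the conventional Newcomb figures, used purely for illustration.

    def newcomb_payoff(decision):
        opaque = 1_000_000 if decision == "one-box" else 0   # filled iff Omega predicts one-boxing
        transparent = 1_000
        return opaque if decision == "one-box" else opaque + transparent

    newcomb_payoff("one-box")   # 1,000,000
    newcomb_payoff("two-box")   # 1,000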