All of Murska's Comments + Replies

Murska00

In this situation, I would shut down the AI, examine it to figure out whether it did torture simulated copies of me, and delete it entirely if it did, or if I can't determine that with high confidence. The threat of torture is bad; letting a UFAI free is worse. Actual torture is probably worse still, but luckily I get to choose before the experience.

-2TheAncientGeek
Please explain which part of the examination establishes that the copies of you are not zombies.
Murska00

True. However, there is no such thing as 'impossible', or probability 0. And while in common language people do use 'impossible' for what is merely 'very improbable', there is no accepted, specific threshold there. Your earlier point, that people draw a false distinction between things that seem possible but unlikely in their model and things that seem impossible in their model, contributes to that. I prefer to use 'very improbable' for things that are very improbable, and 'unlikely' for things that are merely unlikely, but it is important to keep in mind t... (read more)

Murska00

Assuming you mean that things one believes are merely 'unlikely' can actually, more objectively, be less likely than things one believes are outright 'impossible', I agree.

1Eugine_Nier
What I mean is that the conjunction of possible events will be perceived as unlikely, even if enough events are conjoined together to put the probability below what the threshold for "impossible" should be.
Murska20

I am confused now. Did you properly read my post? What you say here amounts to 'I disagree; what you said is correct.'

To restate my point: most people use 'unlikely' as you said, but some, many of whom frequent this site, use it for 'so unlikely it is as good as impossible', and this difference can cause communication issues.

2Eugine_Nier
My point is that in common usage (in other words, from the inside) the distinction between "unlikely" and "impossible" doesn't correspond to any probability. In fact there are "unlikely" events that have a lower probability than some "impossible" events.
Murska00

If I understand you correctly, then I agree. However, it seems clear to me that human beings discount probabilities that seem very small to them, and it also seems to me that we must do so, because calculating them out and having them weigh our actions by tiny amounts is infeasible.

The question of where we should try to set the cut-off point is a more difficult one. It is usually too high, I think. But if, after actual consideration, it seems that something is actually extremely unlikely (as in, on the order of 10^{-18} or whatever), ... (read more)
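The discounting argument above can be sketched numerically. This is a minimal illustration with hypothetical payoffs (none of these numbers come from the thread): a probability on the order of 10^{-18} contributes so little to an expected-value calculation that it vanishes beneath ordinary everyday uncertainty, which is why we shrug and drop such terms in practice.

```python
# Hypothetical numbers illustrating why tiny probabilities get discounted:
# their contribution to an expected value is negligible next to mundane risks.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

mundane = [(0.9, 100.0), (0.1, -50.0)]      # ordinary, everyday uncertainty
with_tail = mundane + [(1e-18, -1e6)]       # add a 10^-18 catastrophe term

print(expected_value(mundane))    # 85.0
print(expected_value(with_tail))  # still ~85.0; the tail term shifts it by only 1e-12
```

The tail term changes the answer by 10^{-12}, far below the precision of the other estimates, so tracking it would not change any decision.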

0Eugine_Nier
Disagree. Most people use "unlikely" for something that fits their model but is improbable, e.g., winning the lottery, having black come up ten times in a row in a game of roulette, or two bullets colliding in mid-air. "Untrue" is used for something that one's model says is impossible, e.g., Zeus or ghosts existing.
Murska00

I would say the probabilities of a Yellowstone eruption or a meteor impact are both vastly higher than something like the existence of a specific deity. They're in the realm of possibilities that are worth thinking about. But there are tons of other possible civilization-ending disasters that we don't, and shouldn't, consider, because they have much less evidence for them and are thus so improbable that they are not worth considering. I do not believe we as humans can function without discounting very small probabilities.

But yeah, I'm generally rather optimisti... (read more)

1Eugine_Nier
Careful there. Our intuition of what's in the "realm of possibilities that are worth thinking about" doesn't correspond to any particular probability; rather, it reflects whether the thing is possible under our current model of the world, and doesn't take into account how likely that model is to be wrong.
Murska00

Yes, that is probably clear to most of us here. But, in reality, I, and most likely also you, discount probabilities that are very small instead of calculating them out and changing our actions (we'll profess 'this is very unlikely' instead of 'this is not true', but what actually happens is the same thing). There's a huge number of probability-10^{-18} deities out there; we just shrug and assume they don't exist unless enough strong (or 'good', I still don't see the difference there) evidence comes up to alter that probability enough that it is in the r... (read more)

0ChristianKl
Given that someone like Richard Kennaway, who's smart and exposed to LW thinking (>10000 karma), doesn't immediately find the point I'm making obvious, you are very optimistic. People usually don't change central beliefs about ontology within an hour of reading a convincing post on a forum. An hour might be enough to change the language you use, but it's not enough to give you a new way to relate to reality.

The probability that an asteroid destroys humanity in the next decade is relatively small. On the other hand, it's still useful for our society to invest more resources into telescopes so that all near-earth objects are covered. The same goes for Yellowstone destroying our civilisation. Our society is quite poor at dealing with low-probability, high-impact events. When it comes to things like Yellowstone, the instinctual response of some people is to say: "Extraordinary claims require extraordinary evidence." That kind of thinking is very dangerous given that human technology gets more and more powerful as time goes on.
Murska100

It looks to me to be rather clear that what is being said ("myths are not evidence for Zeus") translates roughly to "myths are very weak evidence for Zeus, and so my beliefs are changed very little by them". Is there still a real misunderstanding here?

-1ChristianKl
You are making a mistake in reasoning if you don't change your belief in response to that evidence. Your belief should change by orders of magnitude; a change from 10^{-18} to 10^{-15} is a strong change. The central reason to believe that Zeus doesn't exist is weak priors.

Skeptics hold the idea that someone has to prove something to them before they will believe it. In the Bayesian worldview you always have probabilities for your beliefs; social obligations aren't part of it. "Good" evidence means that someone fulfilled a social obligation of providing a certain amount of proof. It doesn't refer to how strongly a Bayesian should update after being exposed to a piece of evidence.

There are very strong instincts in humans to either believe X is true or to believe X is false. It takes effort to think in terms of probabilities.
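The 10^{-18} → 10^{-15} update described here can be sketched as Bayes' rule in odds form. The specific likelihood ratio below (1000:1) is a hypothetical chosen to reproduce the three-orders-of-magnitude shift in the comment, not a number from the thread:

```python
# Sketch of the comment's point: even "weak" evidence can move a tiny prior
# by orders of magnitude, while the posterior remains tiny in absolute terms.

def update_odds(prior_prob, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

prior = 1e-18   # hypothetical prior that a specific deity exists
lr = 1000.0     # myths are 1000x likelier in the world where it exists
posterior = update_odds(prior, lr)
print(posterior)  # ~1e-15: three orders of magnitude up, yet still negligible
```

The update is strong on a log scale but leaves the belief far below any action-relevant threshold, which is why both "the myths moved my probability a lot" and "I still disbelieve" are consistent.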
Murska00

Got me to register, this one. I was curious about my own reaction, here.

See, I took in the problem, thought for a moment about game theory and such, but I am not proficient in game theory. I haven't read much of it. I barely know the very basics. And many other people can do that sort of thinking much better than I can.

I took a different angle, because it should all add up to normality. I want to save human lives here. For me, the first instinct on what to do would be to cooperate on the first iteration, then cooperate on the second regardless of whether ... (read more)

Murska30

Hello.

I'm 21, from Finland. Studying physics, right now. I've felt for my entire life that this is the path I want to take, and even after significant soul-searching lately about whether I really want to pursue it, partially sparked by reading LW, I still haven't changed my mind thus far.

I've read quite a bit of the Sequences and various other posts, mostly because many of the topics are very interesting (though I've found that I am interested in a lot of things), some of them affirming my previous views and others disillusioning. It feels ... (read more)