entirelyuseless


"But of course the claims are separate, and shouldn't influence each other."

No, they are not separate, and they should influence each other.

Suppose your terminal value is squaring the circle using Euclidean geometry. When you find out that this is impossible, you should stop trying. You should go and do something else. You should even stop wanting to square the circle with Euclidean geometry.

What is possible directly influences what you ought to do, and what you ought to desire.

Nope. There is no composition fallacy where there is no composition. I am replying to your position, not to mine.

I do care about tomorrow, which is not the long run.

I don't think we should assume that AIs will have any goals at all, and I rather suspect they will not, in the same way that humans do not, only more so.

Not really. I don't care if that happens in the long run, and many people wouldn't.

I considered submitting an entry basically saying this, but decided that it would be pointless since obviously it would not get any prize. Human beings do not have coherent goals even individually. Much less does humanity.

Right. Utilitarianism is false, but Eliezer was still right about torture and dust specks.

"Can we agree that I am not trying to proselytize anyone?"

No, I do not agree. You have been trying to proselytize people from the beginning, and you are still doing it.

"(2) Claiming authority or pointing skyward to an authority is not a road to truth."

This is why you need to stop pointing to "Critical Rationalism" etc. as the road to truth.

"I also think claims to truth should not be watered down for social reasons. That is to disrespect the truth. People can mistake not watering down the truth for religious fervour and arrogance."

First, you are wrong. You should not mention truths that it is harmful to mention in situations where it is harmful to mention them. Second, you are not "not watering down the truth". You are making many nonsensical and erroneous claims and presenting them as though they were a unified system of absolute truth. This is quite definitely proselytism.

I basically agree with this, although 1) you are expressing it badly, 2) you are incorporating a true fact about the world into part of a nonsensical system, and 3) you should not be attempting to proselytize people.

Nothing to see here; just another boring iteration of the absurd idea of "shifting goalposts."

There really is a difference between a general learning algorithm and specifically focused ones, and indeed, anything that can generate and test and run experiments will have the theoretical capability to control pianist robots and scuba dive and run a nail salon.

"Do you not think the TCS parent hasn't also heard this scenario over and over? Do you think you're like the first one ever to have mentioned it?"

Do you not think that I am aware that people who believe in extremist ideologies are capable of making excuses for not following the extreme consequences of their extremist ideologies?

But this is just the same as a religious person giving excuses for why the empirical consequences of his beliefs are the same whether his beliefs are true or false.

You have two options:

1) Embrace the extreme consequences of your extreme beliefs.

2) Make excuses for not accepting the extreme consequences. But then you will do the same things that other people do, like using baby gates, and then you have nothing to teach other people.

"I should have said also that the stair-falling scenario and other similar scenarios are just excuses for people not to think about TCS."

You are the one making excuses for not accepting the extreme consequences of your extremist beliefs.
