Pfft comments on Take heed, for it is a trap - Less Wrong

47 Post author: Zed 14 August 2011 10:23AM

Comments (187)

Comment author: Pfft 14 August 2011 05:45:23PM 8 points [-]

If you have worked your way through most of the sequences you are likely to agree with the majority of these statements

I realize this is not the main point of the post, but this statement made me curious: what fraction of Less Wrong readers become convinced of these less mainstream beliefs?

To this end I made a Google survey! If you have some spare time, please fill it out. (Obviously, we should overlook the deliberately provocative phrasing when answering).

I'll come back two weeks from now and post a new comment with the results.

Comment author: Pfft 05 September 2011 02:24:21AM *  10 points [-]

Here are the crackpot belief survey results.

All in all, 77 people responded. It seems we do drink the Kool-Aid! Of the substantive questions, the most contentious ones were "many clones" and timeless physics, and even those got over 50%. Thanks to everyone who responded!


I want people to cut off my head when I'm medically dead, so my head can be preserved and I can come back to life in the (far far) future.
Agree 73% Disagree 27%

It is possible to run a person on Conway's Game of Life. This would be a person as real as you or me, who wouldn't be able to tell he's in a virtual world because it looks exactly like ours.
Agree 90% Disagree 10%

Right now there exist many copies/clones of you, some of which are blissfully happy and some of which are being tortured and we should not care about this at all.
Agree 53% Disagree 47%

Most scientists disagree with this, but that's just because it sounds counter-intuitive and scientists are biased against counterintuitive explanations.
Agree 32% Disagree 68%

Besides, the scientific method is wrong because it is in conflict with probability theory.
Agree 23% Disagree 77%

Oh, and probability is created by humans; it doesn't exist in the universe.
Agree 77% Disagree 23%

Every fraction of a second you split into thousands of copies of yourself.
Agree 74% Disagree 26%

Of course you cannot detect these copies scientifically, but that's because science is wrong and stupid.
Agree 7% Disagree 93%

In fact, it's not just people that split but the entire universe splits over and over.
Agree 77% Disagree 23%

Time isn't real. There is no flow of time from 0 to now. All your future and past selves just exist.
Agree 53% Disagree 47%

Computers will soon become so fast that AI researchers will be able to create an artificial intelligence that's smarter than any human. When this happens humanity will probably be wiped out.
Agree 68% Disagree 32%

To protect us against computers destroying humanity we must create a super-powerful computer that won't destroy humanity.
Agree 70% Disagree 30%

Ethics are very important and we must take extreme caution to make sure we do the right thing.
Agree 82% Disagree 18%

Also, we sometimes prefer torture to dust specks.
Agree 69% Disagree 31%

If everything goes to plan a super computer will solve all problems (disease, famine, aging) and turn us into super humans who can then go on to explore the galaxy and have fun.
Agree 79% Disagree 21%

The truth of all these statements is completely obvious to those who take the time to study the underlying arguments. People who disagree are just dumb, irrational, miseducated, or a combination thereof.
Agree 27% Disagree 73%

I learned this all from this website by these guys who want us to give them our money.
Agree 66% Disagree 34%

Comment author: Baughn 14 August 2011 10:26:23PM 9 points [-]

I want to fill it out, I really do, but the double statements make me hesitate.

For example, I do believe that there are ~lots of "clones of me" around, but I disagree that we shouldn't care about this. It matters a great deal if you're an average utilitarian, or something approaching one.

Comment deleted 15 August 2011 11:16:18PM [-]
Comment author: gwern 15 August 2011 11:31:29PM 3 points [-]

Well, to some extent, that's true. If a malicious god gave us a computer with infinite or nigh-infinite computing power, we could probably have AIXI up and running within a few days. Similar comments apply to brain emulation - things like the Blue Brain project indicate our scanning ability, poor as it may seem, is still way beyond our ability to run the scanned neurons.

Even if you don't interpret 'hardware problem' quite that generously, you still have an argument for hard takeoff - this is the 'hardware overhang' argument: if you prefer to argue that software is the bottleneck, then you have the problem that when we finally blunder into a working AI, it will be running on hardware far beyond what was needed for an intelligently-written AI.

So you're faced with a bit of a dilemma. Either hardware is the limit, in which case Moore's law means you expect an AI soon, quickly surpassing human level with a few more cranks of the law; or you expect an AI much further out, but when it comes it'll improve even faster than the other kind would.

Comment author: Jack 17 August 2011 04:13:06AM 2 points [-]

I think this survey is a really good illustration of why degrees of belief are so helpful.