All of idlewire's Comments + Replies

So the real question is: "How will one's credibility be affected in the environment where the idea is presented?" which most likely depends on one's current credibility.

As of now, I don't have much karma, so putting out poor ideas is riskier for this screen name. Eliezer could probably sneak in an entire subtly ludicrous paragraph that might go unnoticed for a while.

He has a history in readers' minds, as well as the karma metric, to make people ignore that flash in the back of their minds that something was off. They are more... (read more)

This kind of brings up the quality of thought that is spent on a subject. Someone with a strong capacity for self-criticism can find flaws more effectively and come to better conclusions more quickly. Those who contemplate ideas under wrong but unshakeable (or rather, invisible) assumptions will stew in poor circles until death. The idea of a comforting or powerful deity, unfortunately, sticks hard when indoctrinated early and consistently.

While I'd have a difficult time pinning myself down as either introvert or extrovert, I notice that when I'm with a comfortable crowd, ideas will fall out of my mouth with so little processing that many sentences end with "... wait, never mind, scratch that." I'll use my close acquaintances as easy parallel processing, or to quickly look at ideas from obvious viewpoints that I tend to overlook.

When I'm in an unfamiliar group or setting, I'll often spend so long revising what I want to say that the conversation moves on and I've hardly said a word for 20 minutes.

This reminds me of an idea I had after first learning about the singularity. I assumed that once we are uploaded into a computer, a large percentage of our memories could be recovered in detail, digitized, reconstructed, and categorized; you would then have the opportunity to let other people view your life history (assuming that minds after a singularity are past silly notions of privacy and embarrassment or whatever).

That means all those 'in your head' comments that you make when having conversations might be up for review, or to be laughed at. Every now... (read more)

3BethMo
I want to read that story! Has anyone written it yet?

Assuming I understood this correctly, you're saying a true AI might find our morality as arbitrary as we would consider pebble-heap sizes, say bugger the lot of us, and turn us into biomass for its nano-furnace.

Could you not argue Occam's Razor from the conjunction fallacy? The more components that are required to be true, the less likely it is that they are all simultaneously true. Propositions with fewer components are therefore more likely, or does that not follow?

2Richard_Kennaway
Propositions with more parts are not necessarily merely the conjunction of those parts. "A or B" and "A and B" may both be the same amount of complexity, by whatever measure, more than A.
1Regex
I was wondering this myself. I roughly knew of Solomonoff induction as related... but apparently that is equivalent! The next thing my memory turned up was the "minimum description length" principle, which, as it turns out... is also a version of Occam's Razor. Funny how that works.

If we look at the original question again: "If two hypotheses fit the same observations equally well, why believe the simpler one is more likely to be true?" If I understand the conjunction fallacy correctly, it is strictly true that adding more propositions cannot increase the probability. That is to say, P(A & B) <= P(B) and P(A & B) <= P(A). So the argument could be made that B might have probability one and would therefore be an equally probable hypothesis with its addition. But if you start with A, and B has probability less than one, including it will strictly lower the probability. Thus, as far as I can tell, Occam's Razor holds except where additional propositions have probability one.

...But if they have probability one, wouldn't they have to be axiomatically identical to just having proposition A? Or would it perhaps have to be probability one given A? I honestly don't know enough here, but I think the basic idea stands?
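The inequality the comments above lean on, P(A & B) <= min(P(A), P(B)), can be checked mechanically. Here's a minimal sketch (my own illustration, not from the thread) that enumerates a toy sample space of equally likely outcomes; the specific events A and B are arbitrary choices for demonstration:

```python
# Toy sample space of ten equally likely outcomes.
omega = set(range(10))
A = {0, 1, 2, 3, 4, 5}   # arbitrary event: P(A) = 6/10
B = {0, 1, 2, 9}         # arbitrary event: P(B) = 4/10

def p(event):
    """Probability of an event under the uniform distribution on omega."""
    return len(event) / len(omega)

p_ab = p(A & B)          # A & B = {0, 1, 2}, so P(A & B) = 3/10

# The conjunction can never be more probable than either conjunct.
assert p_ab <= p(A) and p_ab <= p(B)
print(p(A), p(B), p_ab)  # 0.6 0.4 0.3
```

The assertion holds for any choice of A and B, since A & B is a subset of both; that subset relation is the whole content of the conjunction argument.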

(Deuteronomy 13:7-11)

Talk about a successful meme strategy! No wonder we still have this religion today. It killed off its competitors.

As scary as anosognosia sounds, we could be blocking out alien brain slugs for all we know.

We're really only just now able to identify these risks and start posing theoretical solutions to attempt. Our ability to recognize and realistically respond to these threats is catching up. I think saying that we lack good self-preservation mechanisms is a little unfair as criticism.

You wouldn't give up one IQ point for, say, ten million dollars? It would be a painful decision, but I'm convinced I could have a much better effect on the world with a massive financial head start at only the slightest detriment to my intelligence. A large enough sum of money would afford me the chance to stop working and to study and research for the rest of my life, probably leaving me more intelligent in the long run. Right now, I have to waste my time, superior level of intelligence and all, just to pay for food, shelter, and student loans.

3Dmytry
That depends on where in testing I would lose it (or actually, it doesn't depend, because $10 million is such a huge sum). If it just makes me think a little slower, then who the hell cares; I'll save more time by having $10 million. Likewise, as we gradually get dumber, having $10 million allows one to spend time working on important stuff while younger, so the IQ-hours spent on the work may be larger.
9[anonymous]
Humans lose one point of IQ all the time and don't notice it: cognitive decline with aging, getting hit on the head, some medical conditions, etc. Losing 5 or 10 is, however, pretty noticeable.

Agreed. A lot of what we call intelligence is really speed - both in the short run (how long it takes you to add two numbers in your head, for instance) and in the longer run (how long it takes you to accomplish your ambitious projects). Ten million dollars would free up so much time and let you fake so much long-term speed that it would almost certainly be a gain if you got it for one IQ point. Not that anyone's actually offering this trade.