Comment author: Relsqui 13 October 2010 07:26:29PM 2 points [-]

I'm sure that's true. The difference is that all that extra intelligence is tied up in fallible meatsacks; an AI, by definition, would not be. That was the flaw in my analogy--comparing apples to apples was not appropriate. It would have been more apt to compare a trowel to a backhoe. We can't easily parallelize the excess intelligence across all those human brains. An AI (of the type I presume singularitarians predict) could know more information and process it more quickly than any human or group of humans, regardless of how intelligent those humans were. So, yes, I don't doubt that there's tons of wasted human intelligence, but I find that unrelated to the question of AI.

I'm working from the assumption that folks who want FAI expect it to calculate, discover, and reason out things which humans alone wouldn't be able to accomplish for hundreds or thousands of years, and which would benefit humanity. If that's not the case, I'll have to rethink this. :)

Comment author: dilaudid 14 October 2010 12:00:09PM 1 point [-]

I agree FAI should certainly be able to outclass human scientists in the creation of scientific theories and new technologies. This in itself has great value (at the very least we could spend happy years trying to follow the proofs).

My issue is that I think it will be insanely difficult to produce an AI, and I don't believe it would produce a utopian "singularity" in which people would actually be happy. The same could be said of the industrial revolution. Regardless, my original post is borked. I concede the point.

Comment author: Relsqui 13 October 2010 08:35:43AM 0 points [-]

That depends on what you're trying to accomplish. If you're not using your 200MHz machine because the things you want to work on require at least a gigahertz of processing power, buying the new one might be very productive indeed. This doesn't mean you can't find a good purpose for your existing one, but if your needs are beyond its abilities, it's reasonable to pursue additional resources.

Comment author: dilaudid 13 October 2010 11:14:02AM 0 points [-]

Yeah, I can see that applies much better to intelligence than to processing speed - one might think that a super-genius intelligence could achieve things that a human intelligence could not. Gladwell's Outliers (embarrassing source) seems to refute this - his analysis seemed to show that IQ in excess of 130 did not contribute further to success. Geoffrey Miller hypothesised that intelligence is actually an evolutionary signal of biological fitness - on this view, intellect is simply a sexual display. So my view is that a basic level of intelligence is useful, but excess intelligence is usually wasted.

Comment author: Relsqui 13 October 2010 07:54:40AM *  10 points [-]

I have eight computers here with 200MHz processors and 256MB of RAM each. Thus, it would not benefit me to acquire a computer with a 1.6GHz processor and 2GB of RAM.

(I agree with your premise, but not your conclusion.)

Comment author: dilaudid 13 October 2010 08:11:34AM *  1 point [-]

To directly address your point - what I mean is that if you have one computer that you never use, with a 200MHz processor, I'd think twice about buying a 1.6GHz computer, especially if the 200MHz machine is suffering from depression due to its feelings of low status and worthlessness.

I probably stole that from The Economist too.

Comment author: RichardKennaway 13 October 2010 07:43:36AM 3 points [-]

Did you have this in mind? Cognitive Surplus.

Comment author: dilaudid 13 October 2010 07:52:53AM 0 points [-]

Yes - thank you for the cite.

Comment author: dilaudid 13 October 2010 07:40:08AM *  19 points [-]

There is already a vast surplus of unused intelligence in the human race, so working on generalized AI is a waste of time. (90%)

Edit: "waste of time" is careless, wrong and a bit rude. I just mean a working generalized AI would not make a major positive impact on humankind's well-being. The research would be fun, so it's not wasted time. Level of disagreement should be higher too - say ~95%.

Comment author: pricetheoryeconomist 07 May 2010 04:43:12PM 4 points [-]

I don't see this as a valid criticism, if it is intended as a dismissal. The addendum "beware this temptation" is worth highlighting. While this is a point worth making, the response "but someone would have noticed" is shorthand for "if your point were correct, others would likely believe it as well, and I do not see a subset of individuals who are also pointing this out."

Let's say there are ideas that are internally inconsistent, irrational, or bad (and are thus not propounded when recognized as such) and ideas that are internally consistent, rational, or good. Each idea comes as a draw from a bin of ideas, with some proportion that are good and some that are bad.

Further, each person gets an imperfect signal about whether an idea is good. Finally, we only see ideas that people believe are good, setting the stage for sample selection.

Therefore, when someone is propounding an idea, the fact that you have not heard it before makes it more likely to have been censored--that is, more likely to have been judged a bad idea internally and thus never suggested. I suggest as a Bayesian update that, given you have never heard an idea before, it is more likely to be internally inconsistent/irrational/bad than if you hear it constantly, the idea having passed many people's internal consistency checks.
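A minimal simulation of that selection model, for concreteness - the share of good ideas and the signal accuracy below are invented parameters, not anything claimed above:

```python
import random

random.seed(0)
N = 100_000
P_GOOD = 0.3      # assumed share of good ideas in the bin
SIGNAL_ACC = 0.8  # assumed chance a holder's signal matches the idea's quality

heard, unheard = [], []
for _ in range(N):
    good = random.random() < P_GOOD
    # The holder propounds the idea only if their noisy signal says "good".
    propounded = (random.random() < SIGNAL_ACC) == good
    (heard if propounded else unheard).append(good)

print(f"P(good) prior:          {P_GOOD:.2f}")
print(f"P(good | propounded):   {sum(heard) / len(heard):.2f}")     # ~0.63
print(f"P(good | never heard):  {sum(unheard) / len(unheard):.2f}") # ~0.10
```

Even one noisy filter makes an idea nobody is propounding much more likely to be bad than the base rate suggests; each additional person whose check it passes pushes the estimate the other way.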

Comment author: dilaudid 15 May 2010 06:13:34PM 1 point [-]

Yes - this is exactly the point I was about to make. Another way of putting it is that an argument from authority is not going to cut the mustard in a dialogue (i.e. in a scientific paper, you will be laughed at if your evidence for a theory is another scientist's say-so), but as a personal heuristic it can work extremely well. While people sometimes "don't notice" the 900-pound gorilla in the room (the Catholic sex abuse scandal being a nice example), 99% of the things that I hear this argument used for turn out to be total tosh (e.g. Santilli's Roswell Alien Autopsy film, Rhine's ESP experiments). As Feynman probably didn't say, "Keep an open mind, but not so open that your brains fall out."

Comment author: jhuffman 26 January 2010 01:08:07PM *  5 points [-]

For a single individual the cost is much more than $300. Alcor's website says membership is $478 annually, plus another $120 a year if you elect the stand-by option. Also you need $150K worth of life insurance, which will add a bit more.

Peanuts! You say...

I really don't see the point of signing up now, because I don't see how you can avoid losing all the information in your mind to autolysis unless you get a standby, or at least a very quick (within an hour or two) vitrification. That means I have to be in the right place at the right time when I die, and I simply don't think that's likely now, when any death I experience would almost certainly be sudden and it would be hours and hours before I'm vitrified.

I mean, if I get a disease and have some warning then sure I'll consider a move to Phoenix and pay them their $20k surcharge (about a lifetime's worth of dues anyway) and pay for the procedure in cash up-front. There is no reason for me to put money into dues now when the net present value of those payments exceeds the surcharge they charge if you are a "last minute" patient.

I understand this isn't an option if you don't have at least that much liquidity, but since I do, it makes sense to me to keep it all (and future payments) under my control.

Hopefully that decision is a long time from now and I'll be more optimistic about the whole business at that time. I'll also have a better picture of my overall financial outlook and of whether I'd rather spend that money on my children's future than on my doubtful one.

In response to comment by jhuffman on Normal Cryonics
Comment author: dilaudid 01 February 2010 12:52:39PM 2 points [-]

jhuffman's point made me think of the following devil's advocacy: if someone is very confident of cryonics, say more than 99% confident, then they should have themselves preserved before death. They should really have themselves preserved immediately - otherwise there is a higher risk that they will die in a way that destroys their mind than there is that cryonics will fail. The amount they would be willing to pay would also be irrelevant - they won't need the money while they are preserved. I appreciate that there are probably laws against preserving healthy adults, so this is strictly a thought experiment.

As people get older, their risk of death or brain damage increases, so the confidence level at which they should seek early preservation decreases. Also, as someone gets older their expected "natural" survival time decreases, by definition, so the payoff for not seeking early preservation shrinks all the time. This seems to give the argument some force: if there is a 10% probability that cryonics will succeed, then I really can't see why anyone would let themselves get within 6 years of likely death - they are putting a second lifetime at risk for 6 years of less and less healthy life.

Finally, the confidence level relates to cost. If people can be shown to have a low level of confidence in cryonics, then their willingness to pay should be lower. The figures I've seen quoted require a sum of $150,000. (Whether this is paid via life insurance or not is irrelevant - you must pay for it in the premium, since if you keep the insurance until you die, the probability of the insurer paying out is 100%.) If the probability of cryonics working is 10%, then the average cost of a successful re-animation is $1.5 million. That is a pretty conservative cost, I think - doubtless for some who read this blog it is small change. Not for me, sadly. :)
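A rough sketch of the arithmetic behind both points, purely illustrative - the success probability, the length of the "second lifetime", and the chance that waiting still ends in a timely vitrification are all assumed numbers, not claims about real cryonics odds:

```python
p = 0.10        # assumed probability that cryonics works
L = 1000        # assumed length in years of the "second lifetime"
remaining = 6   # years of natural life left at the decision point
q = 0.5         # assumed chance that waiting still ends in timely vitrification

# Preserve now: expected years = p * L.
# Wait: expected years = remaining + q * p * L.
# So early preservation wins exactly when p * L * (1 - q) > remaining.
margin = p * L * (1 - q) - remaining
print("preserve now" if margin > 0 else "wait", f"(margin: {margin:.0f} years)")

# The cost point: if $150,000 buys a 10% chance, the average price of a
# successful re-animation is the fee divided by the success probability.
print(f"cost per successful revival: ${150_000 / p:,.0f}")  # $1,500,000
```

Note how sensitive the first conclusion is to q and L: if sudden death rarely destroys the brain, or the second lifetime is short, waiting dominates instead.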

Comment author: ciphergoth 13 December 2009 09:09:55AM 2 points [-]

What would the right thing look like? Averaging the log-odds ratio?

Comment author: dilaudid 13 December 2009 05:33:23PM 1 point [-]

That's what I would do. If one person is almost certain (say 1/(10^10^10)) then the strength of their view would be represented. Of course, if anyone gives an irrationally low or high answer, or puts <=0 or >=1, then it overweights their view or blows up.
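A sketch of what that pooling would look like, assuming equal weights (the sample inputs are hypothetical):

```python
import math

def avg_log_odds(probs):
    """Pool probabilities by averaging them in log-odds space.

    As noted above, this blows up if anyone reports exactly 0 or 1,
    and an extreme report (e.g. 1e-100) contributes a huge log-odds
    term, so one near-certain person can drag the average a long way.
    """
    log_odds = [math.log(p / (1 - p)) for p in probs]
    mean = sum(log_odds) / len(log_odds)
    return 1 / (1 + math.exp(-mean))

print(avg_log_odds([0.5, 0.9, 0.99]))  # ~0.91, vs 0.80 for a plain average
```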

Comment author: dilaudid 13 December 2009 05:23:15PM 23 points [-]

Komponisto makes a strange assertion. The appropriate prior is not the base rate at which "someone would commit murder" - there is a body. A more appropriate prior is the probability that "someone who lived with a murder victim committed that murder" - I'm guessing that base rate is of the order of 0.1. Once we take into account that AK and MK weren't in a relationship, that AK is female, and that there is very strong evidence that someone else committed the murder, I'd agree that the probability drops, but these pieces of evidence don't cancel out and leave us with the original prior - the final probability may be higher or lower.

Also the "complexity penalty on the prosecution's theory of the crime is enormous" - that may mean the case was flawed, but it's not evidence she didn't kill MK unless you are willing to give some weight to the conviction (at <0.001, I assume you are not). Or to put it another way, even if the prosecution is completely wrong you cannot set the probability of guilt to 0. This is like assuming AK is guilty because her parents criticized the Italian legal system.

Overall I hope I am a bit more cautious about my abilities than you. In the first half you explain why you, as a human being, cannot be trusted to be rational. Then you set out your case. Why should I trust your rationality, but not others'?

Comment author: SilasBarta 06 December 2009 12:35:59AM *  3 points [-]

You're correct about Intrade's requirement to front the money to cover your position in all cases until the contract ends or you sell it.

However:

"The current bids sum to quite a bit more than 100 so by selling contracts on every outcome you should receive more than 100 and you will never have to pay out more than 100 so you should have a guaranteed profit."

That doesn't follow. Even if the bids sum to more than 100, you have to put up the remaining fraction of $10 on each of those n contracts. With a lot of the bids very low, you have to escrow over $9 on many of them, so to cover all possibilities it looks like you have to front close to $10*(n-1), making it a loss from the beginning once fees and the tied-up capital are counted.

Yes, I ran the numbers in several cases like this in the '08 election.
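A worked example of that capital requirement, with hypothetical bids (n = 5 mutually exclusive, exhaustive outcomes, contracts paying $10):

```python
# Hypothetical bids summing to $10.35, i.e. "more than 100" in points.
bids = [9.20, 0.40, 0.30, 0.25, 0.20]

receipts = sum(bids)                 # cash received for selling one of each
escrow = sum(10 - b for b in bids)   # capital fronted to cover every short
payout = 10                          # exactly one outcome pays out $10

profit = receipts - payout           # locked in at settlement, before fees
print(f"front ${escrow:.2f} to lock in ${profit:.2f}")
# front $39.65 to lock in $0.35 - nearly $10*(n-1) tied up until
# settlement for a sliver of profit, which is the point above.
```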

Comment author: dilaudid 13 December 2009 03:52:46PM 0 points [-]

Horrible. If you can get access to it, use Betfair. It's probably blocked in the States, though.
