Comment author: mare-of-night 27 February 2014 02:32:39PM 2 points [-]

Actually, what I'd been getting at was that if they could convince me they were a better programmer than they really were, I could probably also be convinced that someone was more intelligent than they really were by similar means. If someone did convince me they were more intelligent than they really were, I'd have a harder time finding out I'd been fooled, for the same reason you mentioned.

I wouldn't take fooling me as a sign of all that much intelligence, though. I don't check the things people say about themselves for reasonableness very carefully. (Either not enough mental RAM, or force of habit from not having had enough RAM in the past - social interactions take more mental effort for me than for the typical person.)

Comment author: Chatham 27 February 2014 04:45:28PM 0 points [-]

Well, unless everyone is capable of fooling you, the ability to do so would seem to indicate at least some skill. I’m not sure of the intelligence conversion rate between “capability of deceiving you” and “capability of outperforming you at programming in the particular class you share,” but your realization that the person is actually better at the former and not the latter seems to suggest the individual has a different set of skills, rather than merely being less skilled.

Comment author: mare-of-night 26 February 2014 02:11:15PM 2 points [-]

I'm probably more gullible than average, but I'm pretty sure that people less intelligent than me have done this when talking to me too. A few times, I've made an estimate that a fellow programming student is the same or slightly higher skill level than me based on talking with them, but then when we work on the same problems in class, I have an easier time than they do. (Not the same thing as intelligence, but related.)

Comment author: Chatham 26 February 2014 09:05:13PM 1 point [-]

You believe that you’re more intelligent than they are because you are able to do one task better than them (coding), yet it sounds like they were able to do another task better than you (successfully convincing you that they were more intelligent). I’m not sure why the latter should be ruled out as a sign of intelligence.

Comment author: knb 26 February 2014 10:27:56AM 1 point [-]

I'm most worried about the fact that Kurzweil argued that AGI would be no threat to humans because we would "merge with the machines". He always left vague how he knew that would happen, and how he knew that would stop AI from being a threat.

Comment author: Chatham 26 February 2014 04:08:03PM 0 points [-]

Agreed, especially since, from what I’ve seen, Kurzweil’s reason for being so sanguine about Global Warming is exponential growth. He doesn’t seem to reflect on the problems that Global Warming is causing right now, or that the growth in renewables has come in large part because of people who are concerned.

And the idea that we shouldn’t worry isn’t reassuring when it comes from someone whose predictions of the future have mostly been incorrect. This is a man who stands by his predictions that by 2009, human musicians and cybernetic musicians would routinely play music together and that most text would come from voice recognition software, not keyboards. Anyone who takes him seriously should re-read that chapter of predictions for 2009 (which talks about 3D entertainment rooms, the growing popularity of computer authors, 3D art from computer artists being displayed on screens hung up in people’s houses, nanobots that think for themselves, the growing industry of creating personalities for the artificial personas we routinely communicate with, etc.) and keep in mind that Kurzweil says his predictions were mostly accurate.

Comment author: advancedatheist 25 February 2014 03:33:23PM 13 points [-]

Figuring out a non-eugenics technology to raise IQs would go a long way towards solving other problems. Nick Bostrom in one of his talks argues that raising everyone's IQ by ten points would revolutionize the world for the better, not by making the smartest people marginally smarter, but by "uplifting" billions of dullards above a threshold where they became more educable, more employable, more law-abiding, more likely to save money and plan for the future, and so forth.

Psychologist Linda Gottfredson of the University of Delaware would probably agree with this outcome:

http://www.udel.edu/educ/gottfredson/reprints/1997whygmatters.pdf

Comment author: Chatham 26 February 2014 02:55:23PM 0 points [-]

Good news. A recent study into incentives and IQ scores has shown that a monetary incentive of more than $10 can raise someone’s IQ score by 20 points. Looks like we can revolutionize the world pretty cheaply.

http://news.sciencemag.org/2011/04/what-does-iq-really-measure

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3093513/pdf/pnas.201018601.pdf

Comment author: knb 28 January 2014 07:44:47AM *  1 point [-]

I was somewhat concerned when Google hired Kurzweil because he comes across as very Pollyanna-ish in his popular writings.

Now they're buying a company founded by the guy who created this game.

Comment author: Chatham 26 February 2014 04:46:01AM 0 points [-]

The predictions in his popular writings have been pretty off base. More unsettling is the way he twists words around to pretend they were accurate.