All of bluej100's Comments + Replies

It seems to me that a good model of the Great Recession should predict that male employment would be hit particularly hard, even by the standard of past recessions (see https://docs.google.com/spreadsheet/ccc?key=0AofUzoVzQEE5dFo3dlo4Ui1zbU5kZ2ZENGo4UGRKbFE#gid=0). I think this probably favors ZMP (see http://marginalrevolution.com/marginalrevolution/2013/06/survey-evidence-for-zmp-workers.html). Edit: after normalizing the data against historical context, I'm not so sure.

Or until the supply of low-skill workers depresses the remaining low-skill wage below the minimum wage or the cost of outsourcing. I think we are eliminating a larger proportion of low-skill jobs per year than we ever have before, but I agree that the retraining and regulation issues you pointed out are significant.

4Halfwitz
Well, there's an obvious solution for that.

Yeah, exactly. Especially if you take Cowen's view that science requires increasing marginal effort.

"There's a thesis (whose most notable proponent I know is Peter Thiel, though this is not exactly how Thiel phrases it) that real, material technological change has been dying."

Tyler Cowen is again relevant here with his The Great Stagnation (http://www.amazon.com/The-Great-Stagnation-Low-Hanging-ebook/dp/B004H0M8QS), though I think he considers the stagnation less cultural than Thiel does.

"We only get the Hansonian scenario if AI is broadly, steadily going past IQ 70, 80, 90, etc., making an increasingly large portion of the population fully obsolete in the sense that there... (read more)

An IQ of 300 would be over 13 standard deviations above the mean on the usual scale (mean 100, SD 15). Even lowering the bar: picture a trillion planets, each with a trillion humans, and take the smartest person out of all of them (roughly 10 standard deviations above the mean, around IQ 250), transport him to our reality, and make it very easy for him to quickly clone himself. Do you really think it would take this guy five full years to dominate scientific output?
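A quick back-of-the-envelope check of those figures (a sketch only; it assumes the usual IQ scale with mean 100 and SD 15 and a normal tail, which is surely unrealistic this far out):

```python
from math import erfc, sqrt

def tail_prob(z):
    """P(Z > z) for a standard normal, via the complementary error function."""
    return 0.5 * erfc(z / sqrt(2))

# IQ 300 on the usual scale (mean 100, SD 15) is (300 - 100) / 15 SD above the mean.
z_iq300 = (300 - 100) / 15            # ~13.3 standard deviations
print(z_iq300, tail_prob(z_iq300))    # tail probability on the order of 1e-40

# The "trillion planets, each with a trillion humans" picture is ~1e24 people;
# the smartest of them sits at roughly the 1-in-1e24 point of the distribution.
population = 1e12 * 1e12
print(tail_prob(10.2) * population)   # ~1, so the best of 1e24 is ~10 SD out, i.e. IQ ~250
```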

7Eliezer Yudkowsky
Plenty of low-wage jobs have been automated away by machines over the last four centuries. You don't end up permanently irrevocably unemployed until all the work you can do has been automated away.
6EHeller
I would estimate even longer: a lot of science's rate-limiting steps involve simple routine work that is going to be hard to speed up. Think about the extreme cutting edge: how much could an IQ-300 AI speed up the process of physically building something like the LHC?
bluej100200

The quine requirement seems to me to introduce non-productive complexity. If file reading is disallowed, why not just pass the program its own source code as well as its opponent's?

4AlexMennen
That's a good point. I've already got a few submissions, but on the other hand, I could notify them of the change, and it would only require a trivial modification. Is there a consensus on whether I should do this anyway?
6darius
Yes -- in my version of this you do get passed your own source code as a convenience.
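For concreteness, here is a minimal sketch of the interface bluej100 proposes and darius describes, in which the harness (not the program) supplies both source strings, so no quine is needed. This is illustrative Python only; the actual tournament used its own language and submission format, and the play(my_source, opponent_source) signature here is hypothetical.

```python
import inspect

# Hypothetical harness: each strategy is a function
# play(my_source, opponent_source) -> "C" or "D", and the runner hands it
# both source strings, so the program never has to reproduce its own code.

def cooperate_with_copies(my_source: str, opponent_source: str) -> str:
    """Cooperate iff the opponent's source is an exact copy of mine, else defect."""
    return "C" if opponent_source == my_source else "D"

def run_round(player_a, player_b):
    """Toy runner: fetch each player's source and pass both strings to each player."""
    src_a = inspect.getsource(player_a)
    src_b = inspect.getsource(player_b)
    return player_a(src_a, src_b), player_b(src_b, src_a)

if __name__ == "__main__":
    print(run_round(cooperate_with_copies, cooperate_with_copies))  # ('C', 'C')
```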
bluej100110

I think Eliezer's "We have never interacted with the paperclip maximizer before, and will never interact with it again" was intended to preclude credible binding.

I'll reply two years later: light drinking during pregnancy is associated with fewer behavioral and cognitive problems in children. This is probably a result of the correlation between moderate alcohol consumption and IQ and education, but it's interesting nonetheless.

Steven Brams has devised some fair division algorithms that don't require good will: see his surplus procedure ( http://en.wikipedia.org/wiki/Surplus_procedure ) and his earlier adjusted winner procedure ( http://en.wikipedia.org/wiki/Adjusted_Winner_procedure ).
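To give a sense of how the adjusted winner procedure runs, here is a small sketch (my own illustrative Python with made-up valuations, not code from Brams): each player spreads 100 points over the items, each item first goes to whoever values it more, and then the leader hands items over, lowest valuation ratio first, splitting at most one item, until the two point totals are equal.

```python
def adjusted_winner(val_a, val_b):
    """Brams & Taylor's adjusted winner procedure (illustrative sketch).

    val_a, val_b: dicts item -> points, each summing to 100.
    Assumes both players give every item a positive value.
    Returns dict item -> (share to A, share to B)."""
    # Phase 1: give each item outright to whoever values it more.
    shares = {i: ((1.0, 0.0) if val_a[i] >= val_b[i] else (0.0, 1.0)) for i in val_a}

    def totals():
        ta = sum(s[0] * val_a[i] for i, s in shares.items())
        tb = sum(s[1] * val_b[i] for i, s in shares.items())
        return ta, tb

    # Make sure A is the current leader (swap roles if not), so phase 2
    # always transfers from A to B.
    swapped = totals()[0] < totals()[1]
    if swapped:
        val_a, val_b = val_b, val_a
        shares = {i: (s[1], s[0]) for i, s in shares.items()}

    # Phase 2: move A's items to B in increasing order of val_a/val_b,
    # splitting the last item moved so that the point totals come out equal.
    for item in sorted((i for i, s in shares.items() if s[0] == 1.0),
                       key=lambda i: val_a[i] / val_b[i]):
        ta, tb = totals()
        if ta <= tb:
            break
        x = min(1.0, (ta - tb) / (val_a[item] + val_b[item]))  # fraction handed over
        shares[item] = (1.0 - x, x)

    return {i: (s[1], s[0]) for i, s in shares.items()} if swapped else shares

# Hypothetical valuations:
a = {"house": 60, "car": 30, "boat": 10}
b = {"house": 40, "car": 30, "boat": 30}
print(adjusted_winner(a, b))  # both sides end up with 60 of their own points
```

Transferring in order of increasing valuation ratio is what keeps the final split efficient as well as equalizing the two sides' point totals.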

I just read the RSS feed for a Yudkowsky fix since he left Overcoming Bias.