iceman comments on Open thread, Jan. 25 - Jan. 31, 2016 - Less Wrong

Post author: username2 25 January 2016 09:07PM (3 points)


Comment author: Dagon 25 January 2016 11:41:01PM 1 point

Hmm. The review scared me a bit, and the home page talking about incredibly nearsighted populist economics is a huge turn-off. Still, I probably need to read it.

Is the kindle version different in any way from the free mobi file? I'll gladly spend $5 for good formatting or easier reading, but would prefer not to pay Amazon if they're not providing value.

Comment author: iceman 26 January 2016 12:52:35AM * 1 point

What's wrong with the economics on the home page? It seems fairly straightforward and likely. Mass technological unemployment seems at least plausible enough to be raised to attention. (Also.)

Comment author: Dagon 26 January 2016 01:31:43AM 1 point

It (and your link) treat "employment" as a good. This is ridiculous - employment is simply an opportunity to provide value for someone. Goods and services becoming cheap doesn't prevent people doing things for each other, it just means different things become important, and a larger set of people (including those who are technically unemployed) get more stuff that's now near-free to create.

Comment author: jacob_cannell 26 January 2016 05:42:52AM 4 points

Goods and services becoming cheaper is basically the economist's definition of progress, so that's all good.

a larger set of people (including those who are technically unemployed) get more stuff that's now near-free to create.

There is no natural law which ensures that everyone has earnings potential greater than cost of living. New tech isn't making food or housing cheaper fast enough, and can't be expected to in the future. AI could suddenly make most of the work force redundant without making housing or food free.

Comment author: TheAncientGeek 27 January 2016 04:24:35PM 3 points

There is no natural law which ensures that everyone has earnings potential greater than cost of living.

Indeed not, but that correct idea often leads people to the incorrect idea that robotics-induced disemployment, and subsequent impoverishment, are technological inevitabilities. Whether everybody is going to have enough income to eat depends on how the (increased) wealth of such a society is distributed. Basically, to get to the worst-case scenario, you need a sharp decline of interest in wealth redistribution, even compared to US norms. It's a matter of public policy, not technological inevitability. So it's not really the robots taking over that people should be afraid of, it's the libertarians taking over.

New tech isn't making food or housing cheaper fast enough,

I am not sure what that is supposed to mean. There is enough food and living space to go round, globally, but it is not going to everyone who needs it, which is, again, a redistribution problem.

Comment author: Lumifer 26 January 2016 03:53:59PM * 2 points

New tech isn't making food or housing cheaper fast enough, and can't be expected to in the future

First, what's "fast enough"? Look up statistics of what fraction of income an average American family spent on food a hundred years ago and now.

Second, why don't you expect it in the future? Biosynthesizing food doesn't seem to be a huge problem in the context that includes all-powerful AIs...

Comment author: jacob_cannell 26 January 2016 05:15:54PM 2 points

First, what's "fast enough"?

Fast enough would be Moore's law: the price of food falling by 2x every couple of years. Anything less than this could lead to biological humans becoming economically unviable, even as brains in vats.
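For illustration, a Moore's-law-style halving compounds very differently from a slow decline. The sketch below compares the commenter's hypothetical 2x-every-two-years rate against an assumed ~2%/year decline (both rates are illustrative assumptions, not measured data):

```python
# Hypothetical comparison: Moore's-law-style price halving vs. a slow decline.
# Both annual decline rates are illustrative assumptions, not measured data.

def price_after(initial, annual_factor, years):
    """Price after compounding a constant annual factor for `years` years."""
    return initial * annual_factor ** years

moore_factor = 0.5 ** (1 / 2)   # 2x cheaper every 2 years
slow_factor = 0.98              # assumed ~2%/year decline

# Relative price of a $100 food basket after 20 years under each regime:
moore_price = price_after(100.0, moore_factor, 20)  # ~$0.10 (halved 10 times)
slow_price = price_after(100.0, slow_factor, 20)    # ~$66.76
```

Ten halvings in twenty years leave about a thousandth of the original price, while a 2%/year decline leaves about two thirds of it, which is the gap the comment is pointing at.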

Look up statistics of what fraction of income an average American family spent on food a hundred years ago and now.

Like this?

Second, why don't you expect it in the future? Biosynthesizing food doesn't seem to be a huge problem in the context that includes all-powerful AIs...

Biosynthesized food is an extremely inefficient energy conversion mechanism vs., say, direct solar power. Even in the ideal case, the human body burns about 100 watts. When AGI becomes more power-efficient than that, even magical 100%-efficient solar->food conversion isn't enough for humans to be competitive. When AGI requires less than 10 watts, even human brains in vats become uncompetitive.

A future of all-powerful AIs is a future where digital intelligence becomes more efficient than biological intelligence. So the only solutions where humans remain competitive involve uploading.
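The power-budget argument above can be made concrete with rough numbers. All wattages below are order-of-magnitude assumptions (~100 W for a resting human body, ~20 W for the brain alone), and the viability test is a deliberately simplified sketch of the comment's reasoning:

```python
# Rough sketch of the power-budget argument. All wattages are
# order-of-magnitude assumptions, not precise measurements.

HUMAN_BODY_W = 100.0   # approximate resting metabolic power of a human
HUMAN_BRAIN_W = 20.0   # approximate share consumed by the brain alone

def human_is_competitive(agi_watts, human_watts, food_efficiency=1.0):
    """A biological worker stays viable only if keeping them fed costs no
    more energy than running an AGI doing the same work; even a perfect
    (1.0) solar->food pipeline cannot help once agi_watts < human_watts."""
    return human_watts / food_efficiency <= agi_watts

# With an AGI drawing 50 W, a whole human (100 W) is already uncompetitive,
# even under magically lossless food synthesis:
print(human_is_competitive(50.0, HUMAN_BODY_W))   # False

# At 10 W per AGI, even a brain in a vat (~20 W) loses:
print(human_is_competitive(10.0, HUMAN_BRAIN_W))  # False
```

The point of the sketch is that once the AGI's wattage drops below the human's, no improvement in food-production efficiency (the `food_efficiency` term) can restore the comparison.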

Comment author: Lumifer 26 January 2016 05:25:14PM * 2 points

price of food falling by 2x every couple of years. Anything less than this could lead to biological humans becoming economically unviable,

Why so? Human populations do not double every couple of years.

When AGI requires less than 10 watts, even human brains in vats become uncompetitive.

Hold on. We're not talking about competition between computers and humans. You said that in the future there will not be enough food for all (biological) humans. That has nothing to do with competitiveness.

Comment author: gjm 26 January 2016 07:07:02PM 2 points

We're not talking about competition between computers and humans. You said that in the future there will not be enough food for all (biological) humans.

I think you are misremembering the context. Here's the first thing he said on the subject:

There is no natural law which ensures that everyone has earnings potential greater than cost of living. New tech isn't making food or housing cheaper fast enough, and can't be expected to in the future. AI could suddenly make most of the work force redundant without making housing or food free.

and that is explicitly about the relationship between food cost and earning power in the context of AI.

Comment author: Lumifer 26 January 2016 07:38:19PM * 1 point

I was expressing my reservations about the "New tech isn't making food or housing cheaper fast enough" part.

Of course not everyone has earning potential greater than the cost of living. That has always been so. People in this situation subsist on charity (e.g. of their family) or they die.

As to an AI making work force redundant, the question here is what's happening to the demand part. The situation where an AI says "I don't need humans, only my needs matter" is your classic UFAI scenario -- presumably we're not talking about that here. So if the AI can satisfy everyone's material needs (on some scale from basics to luxuries) all by itself, why would people work? And if it's not going to give (meat) people food and shelter, we're back to the "don't need humans" starting point -- or humans will run a parallel economy.

Comment author: gjm 26 January 2016 09:18:35PM 2 points

I take it jacob_cannell has in mind neither a benevolent godlike FAI nor a hostile (or indifferent-but-in-competition) godlike UFAI, in either of which cases all questions of traditional economics are probably off the table, but rather a gradual encroachment of non-godlike AI on what's traditionally been human territory. Imagine, in particular, something like the "em" scenarios Robin Hanson predicts, where there's no superduperintelligent AI but lots of human-level AIs, probably the result of brain emulation or something very like it, who can do pretty much any of the jobs currently done by biological humans.

If the cost of running (or being) an emulated human goes down exponentially according to something like Moore's law, then we soon have -- not the classic UFAI scenario where humans are probably extinct or worse, nor the benevolent-AI scenario where everyone's material needs are satisfied by the AI -- but an economy that works rather like the one we have now except that almost any job that needs a human being to do it can be done quicker and cheaper by a simulated human being than by a biological one.

At that point, maybe some biological humans are owners of emulated humans or the hardware they run on, and maybe they can reap some or all of the gains of the ems' fast cheap work. And, if that happens, maybe they will want some other biological humans to do jobs that really do need actual flesh. (Prostitution, perhaps?) Other biological humans are out of luck, though.

Comment author: TheAncientGeek 28 January 2016 09:47:27AM -1 points

So if the AI can satisfy everyone's material needs (on some scale from basics to luxuries) all by itself, why would people work?

If people own the advanced robots or AIs that are responsible for most production, why would they be impoverished by them? More to the point, why would they want the majority of people who don't own automated factories to be impoverished, since that would mean having no one to sell to? There's no law of economics saying that in a wealthy society most people would starve; rather, to keep an economy going in anything like its present form, you have to have redistribution. In such a future, tycoons would be pushing for basic income -- it's in their own interests.