There's a lot of status quo bias here. Once upon a time, elevators and telephones had operators, but no longer.
The problem is that the new jobs that still require people are becoming more difficult.
This is an important fact, if true. There are obvious lock-in effects. For example, unemployed auto workers have skills that are no longer valued in the market because of automation. But the claim that replacement jobs are systematically more difficult, so that newly unemployed lack the capacity to learn the new jobs, is a much stronger claim.
Yes. It's obviously true that useful things that are easier to automate will get automated more, so job losses should grow from the easily automated end. The open question is how closely human skill distributions, and the human notion of 'difficulty', line up with what is easiest to automate. It's obviously not a complete match: as a human job, bookkeeping is considered to require more skill ...
Today's post, Traditional Capitalist Values, was originally published on 17 October 2008. A summary (taken from the LW wiki):
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Entangled Truths, Contagious Lies, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.