Personal Note: I would like to thank Normal Anomaly for beta-ing this for me and providing counter-arguments. I am asking them to comment below, so that everyone can give them karma for volunteering and helping me out. Even if you dislike the article, I think it's awesome that they were willing to take time out of their day to help someone they've never met.
Imagine that you live in a world where everyone says "AI is a good idea. We need to pursue it."
Sounds great!
But what if no one really thought that there was any reason to make sure the AI was friendly? That would be bad, right? You would probably think: "Hey, AI is a great goal and all, but before we start pursuing it and actually developing the technology, we need to make sure it's not going to blow up in our faces!"
That seems to me to be a rational response.
Yet it seems like most people are not applying the same thought process to life-extending technology. This website in particular has a habit of using some variant of this argument: "Death is bad. Not dying is good. Therefore life-extending technologies are also good." However, this misses the level of contemplation that has been given to AI. As with AI, there are considerations that must be addressed to ensure this technology is "friendly".
Most transhumanists have heard many of these issues before, normally sandwiched inside a "Death is Bad" conversation. However, these important considerations are often hand-waved away, as the conversation tends to stick to the low-hanging fruit. Here, I present them all in one place so we can tackle them together and perhaps come up with some solutions:
- Over-population: Doubling the human life-span would, at the very least, roughly double the number of people on this planet. If we could double life-spans today, we would go from 7 billion to 14 billion people on Earth within 80 years, not counting regular population growth (see the toy calculation after this list).
Although birthrates are currently falling, all the birthrate data we have comes from populations in which women are fertile for approximately 25 years. That window has not changed much throughout history, so we cannot simply extrapolate current birthrates to a world in which women are fertile for 50 years instead.
In other words, not only would there be a population explosion from people living longer, but I'd be willing to bet that if life extension were available today, birth rates would also go up. Right now, people who like having kids only have enough money and fertile years to raise two or three on average. If you doubled the time they had to reproduce, you would likely double the number of children that child-rearing families have.
For example, in modern society, by the time a woman's children are out of the house and done with college, she is no longer young and/or fertile. Say she had a child at 25: by the time that child is 20, she is 45, and therefore not at a comfortable age to have more children. However, if 45 becomes a young, fertile age for women, many families might well decide to have another round of children.
It's one thing to say, "Well, we will develop technology to increase food yields and decrease fossil fuel consumption," but are you positive we will have those technologies ready in time to save us?
- Social Stagnation: Have you ever tried having a long conversation with an elderly person, only to realize that they are a bigot/homophobe/racist, etc.? We all love Grandpa John and Grammy Sue, but they have to die for society to move forward. If there were 180-year-olds alive today, chances are pretty strong that a good number of them would think that being anti-slavery is pretty progressive. They would have been about 90 years old when women got the right to vote.
We don't so much change our minds as grow new people while the old ones die.
- Life sucks, but at least you die: The world is populated with people suffering from mental disorders like depression, social problems like unemployment, and physical deprivations like poverty and hunger.
It doesn't make sense to extend life until we have made our lives worth extending.
- Unknown Implications: How will this change the way society works? How will it change how people live their lives? We can make some educated guesses, but we won't know for sure what far-reaching effects it would have.
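To make the arithmetic behind the over-population point concrete, here is a back-of-the-envelope sketch. This is my own toy model, not a real demographic projection: it assumes a stationary population of 7 billion, a current lifespan of exactly 80 years, replacement-level births of population/lifespan per year, and that a breakthrough today means nobody dies for the next 80 years.

```python
# Toy illustration of the over-population worry (crude assumptions, not a
# real demographic projection): a stationary 7-billion-person population,
# everyone currently lives exactly `lifespan` years, births hold steady at
# population/lifespan per year, and after a life-extension breakthrough
# nobody dies during the next `years` years.

def project(population=7e9, lifespan=80, years=80):
    births_per_year = population / lifespan  # replacement-level births
    for _ in range(years):
        population += births_per_year        # births continue...
        # ...but no cohort ages out, because everyone now lives far longer
    return population

print(f"{project() / 1e9:.1f} billion")  # -> 14.0 billion after 80 years
```

Under those deliberately crude assumptions, births alone replace the entire starting population over one old lifespan, which is where the 7-to-14-billion figure above comes from; any rise in birth rates would push the number higher still.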
I have a friend who is a professional magician and "psychic", and about a month ago I convinced him to read HPMoR. After cursing me for ruining his sleep schedule for two days, we ended up having a discussion about some of the philosophies in there that we agreed and disagreed with. I was brand-new to LW. He had no prior knowledge of "rationality", but like most of his profession was very analytically minded. I would like to share something he wrote:
We have a lot of ancient wisdom telling us that wishes are bad because we aren't wise, and you're saying... that if we could make ourselves wise, then we can have wishes and not have it blow up in our faces. See the shortest version of Aladdin's Tale:
Wish One: "I wish to be wise."
The End.
Since... I am NOT mature, fully rational, and wise,
I really think I shouldn't have wishes,
Of which, immortality is an obvious specific example.
Because I'm just not convinced
That I can predict the fallout.
I call this "The CEV of Immortality", although at the time, neither of us had heard of the concept of CEV in the first place. The basic idea being that we are not currently prepared enough to even be experimenting with life-extending technologies. We don't know where it will lead and how we will cope.
However, scientists are working on these technologies right now, identifying genes whose protein products can be blocked to greatly extend the lifespans of worms, mice, and flies. Should a breakthrough discovery be made, who knows what will happen? Once it's developed, there's no going back. If the technology exists, people will stop at nothing to use it. You won't be able to control it.
Just like AI, life-extending technologies are not inherently "bad". But supporting the development of life-extending technologies without first answering the above questions is like supporting the development of AI without knowing how to make it friendly. Once it's out of the box, it's too late.
Counter-arguments
(Provided by Normal Anomaly)
Overpopulation Counter-argument: Birth rates are currently going down, and have fallen below replacement in much of the developed world (including the US). According to an article in The Economist last year, population will peak at about 10-11 billion around 2050. This UN infographic appears to predict that fewer people will be born in 2020-2050 than were born in 1980-2010. I am skeptical that the birth rate will increase with life extension. Space colonization is another way of coping with more people (again, on a longer timescale than 40 years). Finally, life extension will probably become available slowly, at first adding only a few extra years and only for the wealthy. This last point also applies to "unknown implications."
Social Stagnation Counter-argument: This leads to a slippery-slope argument for killing elderly people; it's very unlikely that our current lifespans sit at exactly the right tradeoff between social progress and life. Banning elderly people from voting or holding office would be more humane and achieve the same results.
"Life sucks" Counter-argument: This is only an argument for working on making life worth extending, or possibly an argument for life extension not having the best marginal return in world-improvement. Also, nobody who doesn't want to live longer would have to, so life extension technology wouldn't result in immortal depressed people.
These counter-arguments make very good points, but I do not think they are enough to guarantee a 100% "Friendly" transhumanism. I would love to see some discussion of them.
Like last time I posted, I am making some "root" comments. They are: General comments, Over-population, Social stagnation, Life sucks, Unknown consequences. Please put your comment under the root it belongs to, in order to help keep the threads organized. Thank you!
This is Whig history. I am not confident that we are morally superior to prior generations and that future generations will be morally superior to us. Making this argument would require a non-trivial amount of intellectual labor.
A number of people have issued some variant of this response, to which I reply:
We don't have to say that society is getting better, only that it is changing. If society is not changing then it is stagnant. If society is changing, then those best suited to whatever society currently exists are those born relatively recently.