I'm an admin of LessWrong. Here are a few things about me.
Analogously: "I am claiming that people when informed will want horses to continue being the primary mode of transportation. I also think that most people when informed will not really care that much about economic growth, will continue to believe that you're more responsible for changing things than for maintaining the status quo, etc. And that this is a coherent view that will add up to a large set of people wanting things in cities to remain conservatively the same. I separately claim that if this is true, then other people should just respect this preference, and go find new continents / planets on which to build cars that people in the cities don't care about."
Sometimes it's good to be conservative when you're changing things, like if you're changing lots of social norms or social institutions, but I don't get it at all in this case. The sun is not a complicated social institution; it's primarily a source of heat and light, and much of what we need from it can be easily replicated, especially once you have nanobots. I am much more likely to grant that we should be slow to change things like democracy and the legal system than to grant that we should be slow to change exactly how and where we get heat and light. Would you have wanted conservatism around moving from candles to lightbulbs? Installing heaters and cookers in the house instead of fire pits? I don't think so.
I was scrolling for a while, assuming I was near the end, only to look at the position of the scrollbar and find I was barely 5% through! This must have taken a fair bit of effort. I really like this helpful page and I'm glad I know about it; I encourage you to make a linkpost for it sometime if you haven't already.
(Meta: Apologies for running the clock, but it is 1:45am where I am and I'm too sleepy to keep going on this thread, so I'm bowing out for tonight. I want to respond further, but I'm on vacation right now so I do wish to disclaim any expectations of a speedy follow-up.)
Side-note: Just registering that I personally aspire to always taboo 'normal people' and instead name specific populations. I think it tends to sneak in a lot of assumptions to call people 'normal' – I've seen it used to mean "most people on Twitter" or "most people in developed countries" or "most working class people" or "most people alive today" – the latter of which is not at all normal by historical standards!
Thanks for adding that one, I accidentally missed the first reference in the song.
I concede that I was mistaken in saying it was no argument; I will agree that it is a very weak one, often outweighed by other arguments.
Majority vote is useful specifically for determining who holds power, because of the extremely high level of adversarial dynamics there; in contexts that are not as wildly adversarial (including most specific decisions that an institution makes), other decision-making algorithms are generally better.
I took a quick look. I did not quite find this, I found other discussion of suns dying or being used as resources. Sharing as data.
In the song "Five Thousand Years" the lyrics talk about the sun dying in the next 5,000 years.
I don't quite know how things might change
I don't quite know what rules we'd break
Our present selves might think it strange
But there's so many lives at stake...
Entropy is bearin' down
But we got tricks to stick around.
And if we live to see the day
That yellow fades to red then grey,
We'll take a moment, one by one
Turn to face the dying sun
Bittersweetly wave goodbye--
The journey's only just begun...
In (Five thousand years)
(Whatcha want to do, whatcha wanna see, in another)
(Five million years)
(Where we want to go, who we want to be, in another)
Here's a reference to it as a battery, in the (fast, humorous, upbeat) song "The Great Transhumanist Future":
In the Great Transhumanist Future,
There are worlds all fair and bright,
We’ll be constrained by nothing but
The latency of light
When the hospitals are empty
And the sun’s a battery
Making it a breeze
To get outta deep freeze
To give humans wings
And some other things
In the Great Transhumanist Future.
Most decisions are not made democratically, and pointing out that a majoritarian vote is against a decision is no argument that it will not or should not happen. This is true of the vast majority of resource allocation decisions, such as how to divvy up physical materials.
You are putting words in people's mouths by accusing lots of people of wanting to round up the Amish and haul them to extermination camps, and I am disappointed that you would resort to such accusations.
It is good to have deontological commitments about what you would do with a lot of power. But this situation is very different from merely "a lot of power"; it's also "if you were to become wiser and more knowledgeable than anyone in history so far". One can imagine the Christians of old asking for a commitment: "If you get this new scientific and industrial civilization that you want 2,000 years from now, will you commit to following the teachings of Jesus?" Along the way I would sadly find out that, even though it seemed like a good and moral commitment at the time, it totally screwed my ability to behave morally in the future, because Christianity is necessarily predicated on tons of falsehoods and many of its teachings are immoral.
But there is some version of this commitment I think is good to make... something like "Insofar as the players involved are all biological humans, I will respect the legal structures that exist and the existence of countries, and will not relate to them in ways that would be considered worthy of starting a war in their defense". But I'm not certain about this. For instance, what if most countries in the world build 10^10 digital minds and are essentially torturing them? I may well wish to overthrow a country that consists primarily of torture, with a small number of biological humans sitting on thrones on top of these minds, and I am not presently willing to commit not to do that.
I understand that there are bad ethical things one can do with post-singularity power, but I do not currently see a clear way to commit to certain ethical behaviors that will survive contact with massive increases in knowledge and wisdom. I am interested in whether anyone has made other commitments about post-singularity life (or "on the cusp of singularity" life) that they expect to survive contact with reality.
Added: At the very least I can say that I am not going to make commitments to do specific things that violate my current ethics. I have certainly made no positive commitment to violate people's bodily autonomy nor have such an intention.