and more specifically you should not find yourself personally living in a universe where the history of your experience is lost. I say this because it is evidence that we will likely avoid a failure in AI alignment that destroys us, or at least that we will not find ourselves in a universe where AI destroys us all, because alignment will turn out to be easier in practice than we expect it to be in theory.
Can you elaborate on this idea? What do you mean by 'the history of your experience is lost'? Can you supply some links to read on this whole theory?
Could you qualify that statement?
Can you make an AGI given only primordial soup?
An AI will have a utility function. What utility function do you propose to give it?
What values would we give an AI if not human ones? Giving it human values doesn't necessarily mean giving it the values of our current society. It will probably mean distilling our most core moral beliefs.
If you take issue with that all you are saying is that you want an AI to have your values, rather than humanity's, as a whole.
Developing an AGI (and then ASI) will likely involve a series of steps involving lower intelligences. There's already an AI arms race between several large technology companies, and keeping your nose in front is already standard practice because there's a lot of utility in having the best AI so far.
So it isn't true to say that it's simply a race without important intermediate steps. You don't just want to get to the destination first, you want to make sure your AI is the best for most of the race for a whole heap of reasons.
That's a partial list. It also takes good universities, a culture that produces a willingness to take risks, a sufficient market for good products, and I suspect a litany of other things.
I think once you've got a society that genuinely innovates started, it can be hard to kill that off, but it can be and has been done. The problem is, as you mentioned, very few societies have ever been particularly innovative.
It's easy to use established technology to build a very prosperous first world society. For example: Australia, Canada, Sweden. But it's much harder ...
I think it's an interesting point about innovation actually being very rare, and I agree. It takes a special combination of things for it to happen, and that combination doesn't come around much. Britain was extremely innovative a few hundred years ago. In fact, they started the industrial revolution, literally revolutionising humanity. But today they do not strike me as particularly innovative, even with that history behind them.
I don't think America's ability to innovate is coming to an end all that soon. But even if America continues to prosper, will that mean...
You have failed to answer my question. Why does anything at all matter? Why does anything care about anything at all? Why don't I want my dog to die? Obviously, when I'm actually dead, I won't want anything at all. But there is no reason I cannot have preferences now regarding events that will occur after I am dead. And I do.
Why does anything at all matter?
In Australia we currently produce enough food for 60 million people. This is without any intensive farming techniques at all. This could be scaled up by a factor of ten if it was really necessary, but quality of life per capita would suffer.
I think smaller nations are as a general rule governed much better, so I don't see any positives in increasing our population beyond the current 24 million people.
Each human differs in their values. So it is impossible to build the machine of which you speak.
Raid Google and shut them down immediately. Start a Manhattan project of AI safety research.
I really like that you mention world government as an existential risk. It's one of the biggest ones. Competition is a very good risk reduction process. It has been said before that if we all lived in North Korea, the future of humanity might well be quite bleak indeed. North Korea is less stable now than it would be if it were the world's government, because all sorts of outside pressures contribute to its instability (technology created by freer nations, pressure from foreign governments, etc.).
No organisation can ever get it right all th...
You might not care, but a lot of humans do care, and will continue to care. That's why we're discussing it.
There have been wars over land for as long as humans have existed. And non-interaction, even if initially widespread, clearly stopped once it became clear the world wasn't infinite and that particular parts had special value and were contested by multiple tribes. Australia being huge and largely empty didn't stop the European tribes from fighting a series of wars of increasing intensity until we had WW1 and WW2, which were unfathomably violent and huge clashes over ideology and resources. This is what happened in Europe, where multiple tribes of comparable st...
Remember also that viruses that kill lots of people tend to rapidly mutate into less lethal strains due to evolutionary pressures. This is what happened with the 1918 pandemic.
Extremely low. I have never believed any sort of pathogen could come close to wiping us out. They can be defeated by basic respirator and biohazard technology. But the key point is that with improved and more accessible biotechnology, our ability to create vaccines and other defence mechanisms against pathogens is greatly enhanced. I actually think the better biotechnology gets, the less likely any pathogen is to wipe us out, even given the fact that terrorists will be able to misuse it more easily.
Kicking the can down the road doesn't seem to be a likely action of an intelligent civilisation.
Best to control us while they still can, or while the resulting war will not result in unparalleled destruction.
The development of Native Americans has been stunted, and they simply exist within the controlled conditions imposed by the new civilisation now. They aren't all dead, but they can't actually control their own destiny as a people. Native American reservations seem like exactly the sort of thing aliens might put us in: very limited control over our own affairs in desolate parts of the universe, with the addition of welfare payments to give us some sort of quality of life.
If an exact copy of you were to be created, it would have to be stuck in the hole as well. If the 'copy' is not in the hole, then it is not you, because it is experiencing different inputs and has a different brain state.