It is entertaining indeed that a non-computer-scientist entrepreneur (Elon Musk) is emotionally influenced by the fallacious pseudo-scientific bullshit of Nick Bostrom, another non-computer-scientist, and that people are talking about it.
So let's see: a clown writes a book, and an investor thinks it is credible when it is not. What makes this hilarious is people's reactions to it. A ship of fools.
Do you have any serious counter-arguments to the ideas presented in Bostrom's book? A majority of top AI experts agree that we will have human-level AI by the end of this century, and people like Musk, Bostrom and the MIRI folks are just trying to think about the possible negative impacts this development may have on humans. The problem is that the fate of humanity may come to depend on the actions of non-human actors, who will likely have utility functions incompatible with human survival, and it is perfectly rational to be worried about that.
Those ideas are definitely not abov...
Elon Musk submitted a comment to edge.org a day or so ago, on this article. It was later removed.
Now, Elon has been making noises about AI safety in general lately, including, for example, mentioning Bostrom's Superintelligence on Twitter. But this is the first time that I know of that he's come up with his own predictions of the timeframes involved, and his are quite soon compared to most.
We can compare this to MIRI's post from May of this year, When Will AI Be Created, which suggests that it seems reasonable to think of AI as being further away, but also that there is a lot of uncertainty on the issue.
Of course, "something seriously dangerous" might not refer to full-blown superintelligent uFAI - there's plenty of room for disasters whose magnitude falls somewhere between the 2010 flash crash and Clippy turning the universe into paperclips.
In any case, it's true that Musk has more direct exposure to those on the frontier of AGI research than your average person, and it's also true that he has an audience, so I think there is some interest to be found in his comments here.