Well, I hope that the self-importance shown in this post is not a true reflection of the community, though unfortunately I think it might well be.
One aspect of this I find anthropologically interesting is the motivations of Adams and Lahood. Spending years searching for a total stranger's dead body. Why? Why do we want to "know what happened" so badly? What is at stake here? There is a movie like that - it's called The Vanishing. It's quite good.
What does it mean, fundamentally, when something is NOT where it is most likely to be (like Ewasko's body here, well outside of the most searched zone)? Or more generally, when something - permanently - is NOT the way it is most likely to be? Does it mean our assessment of the likelihoods was wrong?
Superhuman capabilities have a long tradition of being associated with God-figures. God-like refers to the human cultural construct of god(s), not to the actuality of a God. It's a short, but accurate, way to say "with properties that humans have historically / traditionally thought of as being associated with gods" which obviously wouldn't flow as well.
I believe they would sincerely welcome governments stepping in.
Exactly - that's the feeling even from merely reading between the lines of their public statements. They are begging politicians to please step in already and start regulating the hell out of them. To my knowledge, no other industry has ever done that. Industries are supposed to fight off regulation, not beg for it.
And, finally, posterity.
Which is where it kind of loops back on itself: for mankind to remember you as the man who delivered AGI, mankind needs to be around. Otherwise, sure: you would still have a couple of murderous machines fondly remembering you as "Daddy" - but it's probably not the same kick.
They are running towards a finish line without an understanding of what lies on the other side.
That's extremely well put.
This has little value other than letting people know of my internal emotional state: but this type of news cheers me up. My personal experience so far has been an ocean of apathy around me. People just don't take the issue / risk that seriously, I think because deep down they are still convinced that AGI (much less ASI) is the stuff of science fiction, that geeks like to talk about as if it were about to become a real thing - but reasonable people know it won't ("We know that the future will be like the past because the past was normal, and why wouldn't the future be normal, duh"). So, when I see mainstream media shaking itself out of this apathetic failure of imagination - I rejoice.
Also, I derive some optimism from the fact that some of the darker sides of human nature are also among those that can be most reliably counted on. When power-hungry people, who tend to run most things, particularly in politics, realize that yes, this could be a real threat to their power (that is: IF it is allowed to become so smart that it can no longer be thought of as a mere tool for increasing and consolidating their current power position, but must be seen as a competitor for said power), they will do their best to prevent its rise.
I doubt it's all that qualitatively different than the sorts of summits humanity has surmounted before
This seems to imply that we have surmounted the field of physics, that all available knowledge in all its subfields has been acquired, whereas the most that can be claimed is that we have reduced the degree of our ignorance in some of those subfields. We have not - by any stretch of the imagination - mastered the field. Indeed, if we think we cannot push our understanding of AI, and of the related alignment problems, further than our current degree of understanding of physics, I think that is a strong point for the "stop everything, while we still can" case.
Oh, that's an interesting way to approach things! If you were asked: a fair coin is tossed, what is the probability it will land on heads - wouldn't you reply 1/2, and wouldn't that reply be relying on such a thing as conventional probability theory?
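For concreteness, a minimal sketch of where that 1/2 comes from, assuming only the classical, equally-likely-outcomes definition of probability:

\[
P(\text{heads}) = \frac{|\{\text{heads}\}|}{|\{\text{heads}, \text{tails}\}|} = \frac{1}{2}
\]

The "fair" in "fair coin" is the assumption doing all the work here: it is what licenses treating the two outcomes as equally likely before the arithmetic even starts.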