I came here just to post "Ulysses"!
And the accompanying song: "Untraveled Worlds" by Paul Halley. The song has lots of sentimental value to me because I sang it in choir when I was in sixth grade. Out of ten years' worth of songs, I chose to have the choir sing it again during my last year of high school.
I would also quote:
Come, my friends,
'T is not too late to seek a newer world.
Push off, and sitting well in order smite
The sounding furrows; for my purpose holds
To sail beyond the sunset, and the baths
Of all the western stars, until I die.
Reminds me of the prophecy from HPMOR: "the one who will tear apart the very stars in heaven". It's just so utterly epic. And I'm not even the space-travel type!
This reasoning doesn't look right to me. Am I missing something you mentioned elsewhere?
The way I understand it, the argument goes like this, cleaned up with an abbreviation (say X is some difficult task, such as solving the alignment problem):
1. An AGI would want to X.
2. Therefore, the AGI can X.
3. Therefore, humanity is more likely to be able to X.
The jump from (1) to (2) doesn't work: just because an AGI wants to X, it's not necessarily true that it can X. Wanting doesn't imply ability for any non-omnipotent entity, pretty much by definition.
The jump from (1) to (2) does work if we're considering an omnipotent AGI. But an omnipotent AGI breaks the jump from (2) to (3), so the chain of reasoning doesn't work for any AGI power level. Just because an omnipotent AGI can X, it's not necessarily true that humanity is more likely to be able to X.
Overall, this argument could be used to show that any X desired by an AGI is more likely to be doable by humans. Of course this doesn't make sense: we shouldn't expect it to be any easier to build a candy-dispensing time machine just because an AGI would want to build one to win the favor of humanity.