Just someone wandering the internet. Someone smart, but not smart like all of you people. Someone silly and creative. Someone who wanders the internet to find cool, isolated areas like LessWrong.
The universe is so awesome and crazy and interesting and I can't wait for when humanity is advanced enough to understand all of it. While you people figure out solutions to the various messes our species is in (I would prefer for Homo sapiens to still exist in 20 years), I'll be standing by for emotional support because I'm nowhere near smart enough to be doing any of that actually important stuff. Remember to have good mental health while you're saving the world.
Pronouns: he/him
It's "101"? I searched the regular internet to find out, but I got some yes's and some no's, which I suspect were just due to different definitions of intelligence.
It's controversial?? Has that stopped us before? When was it done to death?
I'm just confused: if people downvote my stuff, they're probably trying to tell me something, and I don't know what it is. So I'm curious.
Thanks. By the way, do you know why this question is getting downvoted?
I already figured that. The point of this question was to ask whether there could exist things that look indistinguishable from true alignment solutions (even to smart people) but that aren't actually alignment solutions. Do you think such things could exist?
By the way, good luck with your plan. Seeing people go out and do actually meaningful work to save the world gives me hope for the future. Just try not to burn out. Smart people are more useful to humanity when their mental health is in good shape.
Uh, this is a human. Humans find it much harder to rationalize away the suffering of other humans than to rationalize away animal suffering.
And the regular, average people in this future timeline consider stuff like this ethically okay?
hack reality via pure math
What, exactly, do you mean by that?
The above statement could be applied to a LOT of other posts too, not just this one.
How were these discovered? Slow, deliberate thinking, or someone trying some random thing to see what it does and suddenly the AI is a zillion times smarter?
If your utility function weights knowing things more heavily than most people's does, that is not an irrationality.