For something to "exist", it must relate, somehow, to something else, right?
If so, everything relates to everything else by extension, and to some degree, thus "it's all relative".
Some folk on LW have said I should fear Evil AI more than Rogue Space Rock Collisions, and yet, we keep having near misses with these rocks that "came out of nowhere".
I'm more afraid of humans humaning, than of sentient computers humaning.
Is not the biggest challenge we face the same as it has always been, namely spreading ourselves across multiple rocks and other places in space, so all our eggs aren't on a single rock, as it were?
I don't know. I think so. But I also think we should do things as much of a group as possible, and with as much free will as possible.
If I persuade someone, did I usurp their free will? There's strength in numbers, generally, so the more people you persuade, the more people you persuade, so to speak. Which is kind of frightening.
What if the "bigger" danger is the Evil AI? Or Climate Change? Or Biological Warfare? Global Nuclear Warfare would be bad too. Is it our duty to try to organize our fellow existence-sharers, and align them with working towards idea X? Is there a Root Idea that might make tackling All of the Above™ easier?
Is trying to avoid leadership a cop-out? Are the ideas of free will, and group alignment, at odds with each other?
Why not just kick back and enjoy the show? See where things go? Because as long as we exist, we somehow, inescapably, relate? How responsible is the individual, really, in the grand scheme of things? And is "short" a relative concept? Why is my form so haphazard? Can I stop this here[1]?
lol[2], maybe the real challenge, and Key Root Idea®, relates to self control and teamwork…
At least I crack me up. :) "not it!" FIN
LOL! Gesturing in a vague direction is fine. And I get it. My kind of rationality is for sure in the minority here, and I knew it wouldn't be getting updoots. Wasn't sure that was required or whatnot, but I see that it is. Which is fine. Content moderation separates the wheat from the chaff, and the public interwebs from personal blogs.
I'm a nitpicker too, sometimes, so it would be neat to suss out further why the not new idea that “everything in some way connects to everything else" is "false" or technically incor...