Grognor

Comments

Grognor

I suggest a new rule: the source of the quote should be at least three months old. It's too easy to get excited about the latest blog post that made the rounds on Facebook.

Grognor

It is because a mirror has no commitment to any image that it can clearly and accurately reflect any image before it. The mind of a warrior is like a mirror in that it has no commitment to any outcome and is free to let form and purpose result on the spot, according to the situation.

—Yagyū Munenori, The Life-Giving Sword

Grognor

You may find it felicitous to link directly to the tweet.

Grognor

This reminds me of how I felt when I learned that a third of the passengers of the Hindenburg survived. My reaction went something like this, if I recall:

Apparently if you drop people out of the sky in a ball of fire, that's not enough to kill all of them, or even 90% of them.

Grognor

I have become 30% confident that my comments here are a net harm, which is too much to bear, so I am discontinuing them unless someone cares to convince me otherwise.

Edit: Good-bye.

[This comment is no longer endorsed by its author]
Grognor

Which is not the same thing as expecting a project to take much less time than it actually will.

Edit: I reveal my ignorance. Mea culpa.

[This comment is no longer endorsed by its author]
Grognor

Parts of this I think are brilliant; other parts I think are absolute nonsense. Not sure how I want to vote on this.

there is no way for an AI employing computational epistemology to bootstrap to a deeper ontology.

This strikes me as probably true but unproven.

My own investigations suggest that the tradition of thought which made the most progress in this direction was the philosophical school known as transcendental phenomenology.

You are anthropomorphizing the universe.

Grognor

That isn't the planning fallacy.

Grognor

This is a better explanation than I could have given for my intuition that physicalism (i.e. "the universe is made out of physics") is a category error.

Grognor

Whether or not a non-self-modifying planning Oracle is the best solution in the end, it's not such an obvious privileged-point-in-solution-space that someone should be alarmed at SIAI not discussing it. This is empirically verifiable in the sense that 'tool AI' wasn't the obvious solution to e.g. John McCarthy, Marvin Minsky, I. J. Good, Peter Norvig, Vernor Vinge, or for that matter Isaac Asimov.

—Reply to Holden on Tool AI
