buybuydandavis comments on My new paper: Concept learning for safe autonomous AI - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (16)
I was immediately confused by the first two sentences in the abstract:
We may need something that can't be done, but wait, we do require it, so I guess we better figure out how.
Are you making a distinction between defining and specifying?
If you just removed
I'm not sure what you're trying to add to the abstract with that phrase, but as is it mainly adds confusion for me.
I read this as distinguishing between (on the one hand) an externally defined set of parameters and (on the other hand) a locally emergent pattern that may be too complex to be readily understood but which nonetheless produces behavior that conforms to our expectations for the concept. Consider Google's surprising 2012 discovery of cats.
You can teach somebody about the moon by describing it very precisely, or you can teach them about the moon by pointing to the moon and saying 'that thing.' In the latter case, you have specified a concept without defining it.
That's a lot of meaning to be hanging on "defining" and "specifying".
Could entirely be what he meant. I guessed something similar, but I wouldn't want a reader having to guess at the meaning of an abstract.
Thanks for pointing that out, I didn't realize that the intended meaning was non-obvious! Toggle's interpretation is basically right: "rigorously defined" is referring to something like giving the system a set of necessary and sufficient criteria for when something should qualify as an instance of the concept. And "specifying" is intended to refer to something more general, such as building the system in such a way that it's capable of learning the concepts on its own, without needing an exhaustive (and impossible-to-produce) external definition of them. But now that you've pointed it out, it's quite true that the current choice of words doesn't really make that obvious: I'll clarify that for the final version of the paper.
Looks like you're drawing distinctions between different ways of building something that has the desired behavior. That "how" would be the specification.
These concepts could be explicitly specified set-theoretically as concepts, or specified by defining boundaries in some conceptual space, or, more generally, specified algorithmically as the product of an information-processing system with learning behavior and a learning environment, without initially creating an explicit conceptual representation.
It's not that one way is rigorous and the other is not, but that they are different ways of creating something with the desired behavior, or, in your particular case, different ways of creating the concepts you want to use in producing the desired behavior. The distinction between a conceptual specification and an algorithmic specification seems meaningful and useful to me.
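To make the distinction concrete, here is a toy sketch of my own (not from the paper or the comment above): the same concept specified once by an explicit definition, and once algorithmically, by learning a boundary in "conceptual space" from labeled examples. The concept "small number" and the threshold of 10 are arbitrary choices for illustration.

```python
# Toy sketch contrasting two kinds of specification, using the
# (arbitrary) concept "small number", meaning "less than 10".

# Conceptual specification: explicit necessary-and-sufficient criteria.
def is_small_by_definition(x: float) -> bool:
    return x < 10

# Algorithmic specification: a learner plus labeled examples. The
# boundary emerges from the data rather than being stated up front.
def learn_small(examples):
    positives = [x for x, label in examples if label]
    negatives = [x for x, label in examples if not label]
    # Place the decision boundary midway between the largest positive
    # example and the smallest negative example.
    boundary = (max(positives) + min(negatives)) / 2
    return lambda x: x < boundary

training = [(1, True), (4, True), (7, True), (12, False), (15, False)]
is_small_learned = learn_small(training)
```

Both functions end up agreeing on clear cases (here the learned boundary falls at 9.5), but only the first states its criterion explicitly; the second's "definition" is implicit in the learned parameter.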
I think this works as a drop-in replacement for the first two sentences:
I assumed that the type of AI design you're exploring is structurally committed to creating those concepts, instead of simply creating algorithms with the desired behavior, or I would have made more general statements about functionality.
Whatever you think of my proposed wording, and even if you don't like the distinctions I've made, the crucial word that I've added is "but": an adversative conjunction. But, while, instead, ... a word to balance the two things you're trying to distinguish, thereby identifying them. The meaning you intended in the first two sentences was a tension or conflict, but the grammar and sentence structure didn't reflect that.
Thanks. I ended up going with:
At least for me, this very clearly identifies the problem and your proposed approach to tackling it.