eli_sennesh comments on The Brain as a Universal Learning Machine - Less Wrong

Post author: jacob_cannell | 24 June 2015 09:45PM | 82 points




Comment author: Houshalter 22 June 2015 09:22:04AM | 5 points

If you just said a bunch of trivial statements a billion times, and then demanded that I give you money, it would seem extremely suspicious. It wouldn't fit with your pattern of behavior.

If, on the other hand, you gave useful and non-obvious advice, I would do it, because the demand for money wouldn't seem any different from all the other things you told me to do that worked out.

I mean, that's the essence of the human concept of earning trust, and betrayal.

Comment author: [deleted] 27 June 2015 12:25:26AM | 1 point

Yes, but expecting any reasoner to develop abstract concepts with no grounding in features, and then to care about them, is... well, it's not actually complete bullshit, but expecting it to actually happen relies on solving some problems I haven't seen solved.

You could, hypothetically, just program your AI to infer "goodness" as a causal-role concept from the vast sums of data it gains about the real world and our human opinions of it, and then "maximize goodness", formulated as another causal role. But this requires sophisticated machinery for dealing with causal-role concepts, which I haven't seen developed to that extent in any literature yet.

Usually, reasoners develop causal-role concepts in order to explain what their feature-level concepts are doing, and thus, causal-role concepts abstracted over concepts that don't eventually root themselves in features are usually dismissed as useless metaphysical speculation, or at least abstract wankery one doesn't care about.

Comment author: Houshalter 27 June 2015 08:29:39AM | 0 points

I don't think you are responding to the correct comment. Or at least, I have no idea what you are talking about.