shminux comments on Tiling Agents for Self-Modifying AI (OPFAI #2) - Less Wrong

55 Post author: Eliezer_Yudkowsky 06 June 2013 08:24PM




Comment author: shminux 27 June 2013 10:36:17PM -1 points

Just wondering why you see Jonah Sinick as having high enough status to be worth explaining what's been discussed on LW repeatedly. Or maybe I'm totally misreading this exchange.

Comment author: JonahSinick 27 June 2013 11:00:39PM *  0 points

I'm puzzled as to what you think I'm missing: can you say more?

Comment author: Kawoomba 27 June 2013 11:09:46PM *  2 points

Matching "first AGI will [probably] have internal structure analogous to that of a human" and "first AGI [will probably have] many interacting specialized modules" in a literal (cough uncharitable cough) manner, as evidenced by "heavier-than-air flying-machines had feathers and beaks". Your phrasing hints at an anthropocentric architectural bias, analogous to the one you specifically distance yourself from regarding values.

Maybe you should clarify that part, it's crucial to the current misunderstanding, and it's not clear whether by "interacting specialized modules" you'd also refer to "Java classes not corresponding to anything 'human' in particular", or whether you'd expect a "thalamus-module".

Comment author: JonahSinick 27 June 2013 11:33:58PM 2 points

Matching "first AGI will [probably] have internal structure analogous to that of a human" and "first AGI [will probably have] many interacting specialized modules" in a literal (cough uncharitable cough) manner, as evidenced by "heavier-than-air flying-machines had feathers and beaks". Your phrasing hints at an anthropocentric architectural bias, analogous to the one you specifically distance yourself from regarding values.

I think that people should make more of an effort to pay attention to the nuances of people's statements rather than using simple pattern matching.

Maybe you should clarify that part, it's crucial to the current misunderstanding, and it's not clear whether by "interacting specialized modules" you'd also refer to "Java classes not corresponding to anything 'human' in particular", or whether you'd expect a "thalamus-module".

There's a great deal to write about this, and I'll do so at a later date.

To give you a small taste of what I have in mind: suppose you ask "How likely is it that the final digit of the Dow Jones will be 2 in two weeks?" I've never thought about this question. A priori, I have no Bayesian prior. What my brain does is amalgamate:

  1. The Dow Jones index varies in a somewhat unpredictable way.
  2. The last digit is especially unpredictable.
  3. Two weeks is a really long time for unpredictable things to happen in this context.
  4. The last digit could be one of 10 values, between 0 and 9.
  5. The probability of a randomly selected digit between 0 and 9 being 2 is 10%.

Different parts of my brain generate the different pieces, and another part of my brain combines them. I'm not using a single well-defined Bayesian prior, nor am I satisfying a well-defined utility function.
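The amalgamation described above can be sketched as a toy calculation. This is purely illustrative and not from the original comment; the names are invented, and the point is only that judgments 1-3 justify treating the digit as uninformative, after which judgments 4-5 reduce to a uniform count.

```python
# Illustrative sketch (hypothetical): combining the five informal
# judgments into a single probability estimate.

# Judgments 1-3: two weeks out, the last digit is effectively
# unpredictable, so no digit deserves extra weight.
digits = range(10)  # judgment 4: the possible last digits, 0 through 9

# Judgment 5: with nothing favoring any digit, assign a uniform
# distribution and read off the probability that the digit is 2.
p_two = sum(1 for d in digits if d == 2) / len(digits)
print(p_two)  # 0.1
```

The interesting part of the comment is not the arithmetic, of course, but that the uniformity assumption itself is produced by separate heuristic judgments rather than by a single pre-existing prior.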

Comment author: shminux 27 June 2013 11:57:08PM 0 points

I don't want to comment on the details, as this is way outside my area of expertise, but I do want to point out that you appear to be a victim of the bright dilettante fallacy. You appear to think that your significant mathematical background makes you an expert in an unrelated field without having to invest the time and effort required to get up to speed in it.

Comment author: JonahSinick 28 June 2013 12:04:19AM *  0 points

I don't claim to have any object level knowledge of AI.

My views on this point are largely based on what I've heard from people who work on AI, together with introspection as to how I and other humans reason, and the role of heuristics in reasoning.

Comment author: pop 15 July 2013 06:53:20AM 0 points

Maybe something to do with Jonah being previously affiliated with GiveWell?