27chaos comments on FAI Research Constraints and AGI Side Effects - Less Wrong

Post author: JustinShovelain 03 June 2015 07:25PM




Comment author: Gram_Stone 03 June 2015 09:04:41PM 5 points [-]

Formalizations can be simple and useful at the same time. I'm reminded of things like Chapter 4 of Superintelligence and Bostrom's GCR model. These are relatively simple models, but they make explicit things that we had previously considered only in natural language. Attention is a limited resource, and models like this let us focus it: empirically, on the model's inputs and the observations we should be making; theoretically, on what to formalize next. Technological strategy cannot be discussed in natural language forever if we are to make substantial progress, and now we have a better idea of what to measure.

Comment author: 27chaos 03 June 2015 09:28:54PM -2 points [-]

I hope we see such progress soon.