rlsj comments on Superintelligence Reading Group - Section 1: Past Developments and Present Capabilities - Less Wrong

Post author: KatjaGrace 16 September 2014 01:00AM

Comment author: rlsj 16 September 2014 07:56:58PM 7 points

Excuse me? What makes you think it's in control? Central Planning lost a lot of ground in the Eighties.

Comment author: Liso 19 September 2014 04:13:22AM 3 points

This is a good point, and one I would like to see analysed more precisely. (I miss a deeper analysis of it in The Book :) )

Could we count the will (motivation) of today's superpowers = megacorporations as human or not? (And to what extent could they control the economy?)

In other words: Is Searle's Chinese room intelligent? (by the definition The Book uses for (super)intelligence)

And if it is, is it a human or an alien mind?

And could it be superintelligent?

What arguments could we use to prove that none of today's corporations (or states, or their secret services) is superintelligent? Think of collective intelligence with computer interfaces! Are they really slow at thinking? How could we measure their IQ?

And could we humans (who?) control it (how?) if they are superintelligent? Could we at least try to implement some moral thinking (or other human values) into their minds? How?

Law? Is law enough to prevent a superintelligent superpower from doing wrong things? (for example, destroying the rain forest because it wants to make more paperclips?)

Comment author: KatjaGrace 22 September 2014 04:19:56AM 2 points

Good question.

I don't think central planning vs. distributed decision-making is relevant though, because it seems to me that humans do a similar amount of decision-making either way: the question is just whether a large or a small number of people are making the decisions, and who decides what.

I usually think of the situation as there being a collection of (fairly) goal-directed humans, each with different amounts of influence, and a whole lot of noise that interferes with their efforts to do anything. These days humans can lose control in the sense that the noise might overwhelm their decision-making (e.g. if a lot of what happens is unintended consequences due to nobody knowing what's going on), but in the future humans might lose control in the sense that their influence as a fraction of the goal-directed efforts becomes very small. Similarly, you might lose control of your life because you are disorganized, or because you sell your time to an employer. So while I concede that we lack control already in the first sense, it seems we might also lose it in the second sense, which I think is what Bostrom is pointing to (though now I come to spell it out, I'm not sure how similar his picture is to mine).

Comment author: cameroncowan 19 October 2014 06:50:59PM 0 points

The economy is a group of people making decisions based on the actions of others. It's a non-centrally-regulated hive mind.

Comment author: rcadey 21 September 2014 08:35:05PM -1 points

I have to agree with rlsj here - I think we're at the point where humans can no longer cope with the pace of economic conditions: ultra-low-latency trading systems already make most of the decisions that underlie the current economy. Presumably the limit of economic growth will be linked to "global intelligence" - and we seem to be at the point where human intelligence is the limiting factor (currently we seem unable to sustain economic growth without killing people and the planet!)