
is4junk comments on Have you changed your mind recently? - Less Wrong Discussion

8 Post author: Snorri 06 February 2015 06:34PM




Comment author: is4junk 07 February 2015 05:13:30PM *  0 points [-]

Why not try to exploit the singularity for fun and profit? It's like having an opportunity to buy Apple stock dirt cheap.

  • Investment: own data-center stocks initially. I'm not sure what you would transition to once the AI learns to make CPUs.
  • Regulatory: make the singularity pay you rent by acting as a gatekeeper. This will be a large industry worldwide. Probably the best bet.

At the very least you should be able to rule out bad investments (of time or money):

  • Energy
  • Land
  • Jobs that will be automated
Comment author: adamzerner 07 February 2015 05:27:24PM 0 points [-]

Hm. Well, if/once the singularity does happen, I would think it'd be beyond my ability to manipulate. But I think your points are valid with reference to the time leading up to it.

Regulatory: make the singularity pay you rent by being a gatekeeper. This will be a large industry worldwide. Probably the best bet.

Could you explain this a bit more? I don't understand how anyone could be a gatekeeper.

Comment author: is4junk 07 February 2015 06:22:00PM 0 points [-]

I mean it in the non-flattering sense: rent-seeking.

I envision all sorts of arbitrary legal limits imposed on AIs. These limits will need people to dream them up, evangelize the need for even more limits, and enforce the limits (likely involving the creation of other 'enforcer' AIs). Some of the limits (early on) will be good ideas, but as time goes on they will become more arbitrary and exploitable. If you want examples, just think of what laws they will try to pass to stop unfriendly AI and to stop individuals from using AI to do evil (say, with an advanced makerbot).

Once you have a role in the regulatory field, converting it to fun and profit is a straightforward exercise in politics. How many people end up in this role is determined by how successful the field is at limiting AIs.

Comment author: adamzerner 07 February 2015 07:13:24PM *  0 points [-]

Ah, OK. I was assuming that if a singularity occurred it'd be beyond our control, and that our fate would be determined by how the AI was originally programmed. But my reason for assuming this is based on very limited information, so I don't really know. If it were the case that people with political power control AI, then I think you are very right.

But if you're right, and we live in a society where ASI-level power is controlled by people with political power... that really, really scares me. My intuition is that it'd be just a matter of time before someone screws up. I'm not sure what to think of this...