The question that the ideas are supposed to be in response to is:
What are the critical infrastructures that only government can help provide that are needed to enable creation of new biotechnology, nanotechnology, and information technology products and innovations -- a technological convergence that we have been calling the "Golden Triangle" -- that will lead to new jobs and greater GDP?
Here are links to some proposed ideas that you should vote for, assuming you agree with them. You do have to register to vote, but the email confirmation arrives right away, and the whole process shouldn't take much more than two minutes of your time. Why should you do this? The top-voted ideas from this request for ideas will be seen by some of the top policymakers in the USA. They probably won't do anything like immediately convene a presidential panel on AGI, but we will be letting them know that these issues are really important.
Research the primary cause of degenerative diseases: aging / biological senescence
Explore proposals for sustaining the economy despite ubiquitous automation
Establish a Permanent Panel or Program to Address Global Catastrophic Risks, Including AGI
Does anyone have any other ideas? Feel free to submit them directly to IdeaScale, but it may be better to first post them in the comments of this post for discussion.
Not without some changes; yes - and: not part of the human economy.
Various machines certainly behave in goal-directed ways - and so have what can usefully be described as "vested interests" - along the lines described here:
http://en.wikipedia.org/wiki/Vested_interest
Can you say what you mean by "interests"? Any difference of opinion here is probably a matter of differing definitions - and so is not terribly interesting.
Re: "The fact that machines are not exclusively on our side simply means that they do not perfectly fulfill our values."
That wasn't what I meant. What I meant is that they don't completely share human values - not that they don't fulfill them.
By interests, I mean concerns related to fulfilling values. For the time being, I consider human minds to be the only entities complex enough to have values. For example, it is very useful to model a cancer cell as having the goal of replicating, but I don't consider it to have replicating as a value.
The cancer example also shows that our own cells don't fulfill or share our values, and yet we still model consumption by cancer cells as consumption by a human being.