Note to self: do not buy stuff from Nancy Lebovitz.
Better yet, don't go gaga. And use anchoring to your advantage - before haggling, talk about something you got for free.
"A group of Singularity Institute donors has stepped forward to match all donations given through Philanthroper today"
Does this make sense? How do we know they wouldn't have given that money anyway?
Well, if the donations they have to match exceed what they'd have given anyway, then they end up donating more than they otherwise would. Plus, the goal is to get YOU to donate more than you otherwise would.
What's up with the word "foom", and why is it always in all caps? Can we come up with another name for this that doesn't sound like a sci-fi nerd in need of Ritalin?
Well, not unmodified humans. You don't execute a 21st century jailbreak with spears and a loincloth. The outside world is not as resource-limited - and so it has some chance of gathering useful information from the attempt.
And if they're modified? It's a superintelligent AI. You can't take it down with a shotgun, even if it's built into your arm.
You don't see why people would want to break into a compound containing the first machine intelligence?
Sure, but it's their funeral.
Another AI might succeed, but not humans. I think there would be at least a few weeks before another one appears, and that might be enough time to ask it how to make a true FAI.
One obvious problem will be people trying to break in. They have all the resources of the outside world to attempt that with.
Well, then they'll have themselves to blame when the AI converts their remains into nanomachines.
Not sure what you're saying.
Unfortunately it can't even limit itself to this. Every object with mass exerts a gravitational attraction on every other object, so it can't help but affect the world outside through these means as well. That forces us to allow it to do so, which may result in disaster for all we know. We also have to allow some radiation out, since this is also unavoidable. At this point I should point out that detonating a nuclear warhead can probably be fit into the category of "emitting waste heat and radiation".
I did mention explosions. And gravity? I don't see what it could do with gravity. Although I see that it could do something with vibration.
It's allowed to produce waste heat. I see no reason to let it make anything else. I know it can't actually cut itself off from the universe, but it shouldn't enjoy this fact.
I suspect that giving an unfriendly superintelligent AI "ten thousand cubic meters" of space will probably mean the end of humanity. Though some of the other ideas here are good, this one is pretty worrisome.
Why? This is the whole point - to prevent it from interacting with anything not intentionally given to it.
I will repost a quote that I posted many moons ago on OB, if you don't mind. I don't THINK this breaks the rules too badly, since that post didn't get its fair share of karma. Here's the first posting: http://lesswrong.com/lw/uj/rationality_quotes_18/nrt
"He knew well that fate and chance never come to the aid of those who replace action with pleas and laments. He who walks conquers the road. Let his legs grow tired and weak on the way - he must crawl on his hands and knees, and then surely, he will see in the night a distant light of hot campfires, and upon approaching, will see a merchants' caravan; and this caravan will surely happen to be going the right way, and there will be a free camel, upon which the traveler will reach his destination. Meanwhile, he who sits on the road and wallows in despair - no matter how much he cries and complains - will evoke no compassion in the soulless rocks. He will die in the desert, his corpse will become meat for foul hyenas, his bones will be buried in hot sand. How many people died prematurely, and only because they didn't love life strongly enough! Hodja Nasreddin considered such a death humiliating for a human being.
"No" - said he to himself and, gritting his teeth, repeated wrathfully: "No! I won't die today! I don't want to die!""