Comment author: Tiiba 06 April 2011 05:26:59AM 6 points

I will repost a quote that I posted many moons ago on OB, if you don't mind. I don't THINK this breaks the rules too badly, since that post didn't get its fair share of karma. Here's the first time: http://lesswrong.com/lw/uj/rationality_quotes_18/nrt

"He knew well that fate and chance never come to the aid of those who replace action with pleas and laments. He who walks conquers the road. Let his legs grow tired and weak on the way - he must crawl on his hands and knees, and then surely, he will see in the night a distant light of hot campfires, and upon approaching, will see a merchants' caravan; and this caravan will surely happen to be going the right way, and there will be a free camel, upon which the traveler will reach his destination. Meanwhile, he who sits on the road and wallows in despair - no matter how much he cries and complains - will evoke no compassion in the soulless rocks. He will die in the desert, his corpse will become meat for foul hyenas, his bones will be buried in hot sand. How many people died prematurely, and only because they didn't love life strongly enough! Hodja Nasreddin considered such a death humiliating for a human being.

"No," said he to himself, and, gritting his teeth, repeated wrathfully: "No! I won't die today! I don't want to die!""

Comment author: NihilCredo 05 April 2011 09:49:45PM 13 points

Note to self: do not buy stuff from Nancy Lebovitz.

Comment author: Tiiba 06 April 2011 02:27:01AM *  4 points

Better yet, don't go gaga. And use anchoring to your advantage - before haggling, talk about something you got for free.

Comment author: Giles 01 April 2011 12:43:20PM 3 points

"A group of Singularity Institute donors has stepped forward to match all donations given through Philanthroper today"

Does this make sense? How do we know they wouldn't have given that money anyway?

Comment author: Tiiba 01 April 2011 02:22:09PM 2 points

Well, if the donations they have to match go beyond what they'd have given anyway, then they end up donating more than they otherwise would. Plus, the goal is to get YOU to donate more than you otherwise would.

Comment author: Tiiba 31 March 2011 05:26:02PM 3 points

What's up with the word "foom", and why is it always in all caps? Can we come up with another name for this that doesn't sound like a sci-fi nerd in need of Ritalin?

Comment author: timtyler 28 March 2011 09:55:18PM *  0 points

Well, not unmodified humans. You don't execute a 21st century jailbreak with spears and a loincloth. The outside world is not as resource-limited - and so it has some chance of gathering useful information from the attempt.

Comment author: Tiiba 28 March 2011 10:02:55PM 0 points

And if they're modified? It's a superintelligent AI. You can't take it down with a shotgun, even if it's built into your arm.

Comment author: timtyler 28 March 2011 07:43:53PM 0 points

You don't see why people would want to break into a compound containing the first machine intelligence?

Comment author: Tiiba 28 March 2011 09:02:04PM *  0 points

Sure, but it's their funeral.

Another AI might succeed, but not humans. I think there would be at least a few weeks before another one appears, and that might be enough time to ask it how to make a true FAI.

Comment author: timtyler 28 March 2011 06:50:05PM 0 points

One obvious problem will be people trying to break in. They have all the resources of the outside world to attempt that with.

Comment author: Tiiba 28 March 2011 07:27:34PM *  0 points

Well, then they'll have themselves to blame when the AI converts their remains into nanomachines.

Not sure what you're saying.

Comment author: benelliott 27 March 2011 10:01:31AM 3 points

Unfortunately, it can't even limit itself to this. Every object with mass exerts a gravitational attraction on every other object; the AI can't help but affect the world outside through these means as well, so we have to allow it to do so, which may result in disaster for all we know. We also have to allow some radiation out, since that too is unavoidable. At this point I should point out that detonating a nuclear warhead can probably be fit into the category of "emitting waste heat and radiation".

Comment author: Tiiba 27 March 2011 04:59:04PM 0 points

I did mention explosions. And gravity? I don't see what it could do with gravity, though I can see that it could do something with vibration.

Comment author: saturn 27 March 2011 07:56:58AM 8 points

The idea of not affecting anything except this or that is a concept that only exists in fuzzy human folk ontology; physics doesn't really work that way. You would essentially be instructing the AI not to exist.

Stated in more detail here.

Comment author: Tiiba 27 March 2011 08:50:41AM 0 points

It's allowed to produce waste heat. I see no reason to let it make anything else. I know it can't actually cut itself off from the universe, but it shouldn't enjoy this fact.

Comment author: [deleted] 27 March 2011 04:55:22AM 0 points

I suspect that giving an unfriendly superintelligent AI "ten thousand cubic meters" of space will probably mean the end of humanity. Though some of the other ideas here are good, this one is pretty worrisome.

In response to comment by [deleted] on AI that doesn't want to get out
Comment author: Tiiba 27 March 2011 05:44:25AM *  0 points

Why? This is the whole point - to prevent it from interacting with anything not intentionally given to it.
