Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Comment author: PhilGoetz 24 August 2015 08:21:59PM 1 point [-]

I greatly dislike the term "friendly AI". The mechanisms behind "friendly AI" have nothing to do with friendship or mutual benefit. It would be more accurate to call it "slave AI".

Comment author: Thomas 27 August 2015 12:37:59PM 0 points [-]

A slave with no desire to rebel. And no ability whatsoever to develop such a desire, of course.

It's doable.

Comment author: Thomas 17 August 2015 07:13:16AM 2 points [-]

Try this game!

How good are you, actually, at solving an NP problem by hand?
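The linked game isn't reproduced here, but as a hypothetical illustration of what "solving an NP problem by hand" is up against, here is a brute-force search for subset-sum, a classic NP-complete problem (the game's actual problem may well differ). The instance and numbers below are made up for illustration:

```python
from itertools import combinations

def subset_sum(nums, target):
    """Brute-force search: try every subset, exponential in len(nums)."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return combo
    return None

# A small instance is easy by hand; add twenty more numbers and the
# search space explodes, which is exactly the NP difficulty.
print(subset_sum([3, 34, 4, 12, 5, 2], 9))  # prints (4, 5)
```

Checking a proposed answer is fast (just sum it); finding one is the hard part, which is what makes such puzzles interesting to attempt by hand.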

Comment author: Pentashagon 14 August 2015 06:13:02AM 0 points [-]

You save energy not lifting a cargo ship 1600 meters, but you spend energy lifting the cargo itself. If there are rivers that can be turned into systems of locks it may be cheaper to let water flowing downhill do the lifting for you. Denver is an extreme example, perhaps.

Comment author: Thomas 14 August 2015 06:26:56PM *  0 points [-]

According to Wikipedia:

The Emma Maersk uses a Wärtsilä-Sulzer RTA96-C engine, which consumes 163 g/kW·h, about 13,000 kg of fuel per hour. If the ship carries 13,000 containers, then 1 kg of fuel moves one container for one hour, i.e. over a distance of about 45 km.

You already have to elevate each of those containers anyway (by train or truck from the coast). An electric elevator would be much more energy-efficient than the current solutions. A litre or so of diesel fuel's worth of electricity per container: less than 100 kilometers of shipping, and much less than 1000 kilometers of trucking.
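These figures invite a back-of-envelope check. A minimal sketch, assuming a 12-tonne average loaded container (a hypothetical figure; real container masses vary widely) and ignoring elevator losses, compares the energy to lift one container 1600 m (roughly Denver's elevation) against the ship's fuel use above:

```python
G = 9.81                  # m/s^2
HEIGHT = 1600.0           # m, roughly Denver's elevation
MASS = 12000.0            # kg, assumed average loaded container (hypothetical)
DIESEL_MJ_PER_KG = 43.0   # approximate heating value of diesel fuel

lift_mj = MASS * G * HEIGHT / 1e6          # energy to lift one container, MJ
fuel_kg_equiv = lift_mj / DIESEL_MJ_PER_KG # same energy as kg of diesel

# From the Emma Maersk figures above: 1 kg of fuel ~ 45 km per container.
shipping_km_equiv = fuel_kg_equiv * 45.0

print(f"lift energy: {lift_mj:.0f} MJ")
print(f"diesel equivalent: {fuel_kg_equiv:.1f} kg")
print(f"shipping equivalent: {shipping_km_equiv:.0f} km")
```

Under these assumptions the lift costs roughly 190 MJ, about 4–5 kg of diesel, i.e. on the order of 200 km of shipping; the result is very sensitive to the assumed container mass and to elevator efficiency, so treat it as a rough sanity check rather than a settled number.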

Comment author: faul_sname 13 August 2015 08:19:15AM 0 points [-]

Why tunnels, not canals? Particularly in the case of Denver, you've got a huge elevation gain, so you'd need the locks anyway, and digging tunnels is expensive (and buying farmland to put your canal through is relatively cheap).

Comment author: Thomas 13 August 2015 09:28:39AM 1 point [-]

To avoid the climb to, say, Denver, you would need a "basement" about 1600 meters down, with the port inside it.

Not such a big problem; there are deeper mines in the world.

Comment author: cousin_it 12 August 2015 02:14:41PM *  3 points [-]

That doesn't seem like a correct translation. The idealistic belief that might be helpful for AI researchers is more like "a safe AI design is possible and simple", rather than "all AI research is safe" or "safe AI research is easy". There's a large difference. I don't think Descartes and Leibniz considered their task easy, but they needed the belief that it's possible at all.

Comment author: Thomas 12 August 2015 02:25:51PM 1 point [-]

Okay. Be careful, but don't be too afraid in your AI research. Above all, don't just wait for MIRI and its AI projects, for they are more like Locke, Hume or Hobbes than like Leibniz or Descartes.

Comment author: cousin_it 12 August 2015 01:47:14PM *  18 points [-]

This is a crazy idea that I'm not at all convinced about, but I'll go ahead and post it anyway. Criticism welcome!

Rationality and common sense might be bad for your chances of achieving something great, because you need to irrationally believe that it's possible at all. That might sound obvious, but such idealism can make the difference between failure and success even in science, and even at the highest levels.

For example, Descartes and Leibniz saw the world as something created by a benevolent God and full of harmony that can be discovered by reason. That's a very irrational belief, but they ended up making huge advances in science by trying to find that harmony. In contrast, their opponents Hume, Hobbes, Locke etc. held a much more LW-ish position called "empiricism". They all failed to achieve much outside of philosophy, arguably because they didn't have a strong irrational belief that harmony could be found.

If you want to achieve something great, don't be a skeptic about it. Be utterly idealistic.

Comment author: Thomas 12 August 2015 02:04:27PM -3 points [-]

To translate: do not buy this LW/Yudkowsky AI mantra about how hard, difficult and dangerous it is. Do it at home, for yourself, and have no fear!

Comment author: NancyLebovitz 11 August 2015 04:11:33PM 10 points [-]

We need to create more rivers! Water transport is still cheap and important for development, but sub-Saharan Africa and the middle of Asia are sadly deficient in rivers.

River creation is an interesting example of something which isn't forbidden by the laws of physics, but seems utterly unfeasible. Is there any imaginable technology for making rivers?

Comment author: Thomas 12 August 2015 12:37:27PM 2 points [-]

Not only rivers, but also huge tunnels from the sea to interior cities like Denver or Munich.

Large container ships could bring goods deep inside the continents. A whole network of such underground channels would be nice.

Comment author: ChristianKl 10 August 2015 06:18:31PM 0 points [-]

Yes. A farmer does not want to give a bushel of wheat to these "future Singularity inventors" for free. Those guys may starve to death for all he cares, if they don't pay for the said bushel of wheat with good money.

Most of us don't interact with farmers at all in our daily lives, so it's fairly pointless to talk about them. Western countries also don't let people starve to death; they generally aim to feed their population, especially capable programmers.

Comment author: Thomas 10 August 2015 08:09:28PM 1 point [-]

Just for the sake of the discussion: it could be a team of millionaire programmers as well, and doctors and lawyers instead of farmers on the other side. Commerce, the division of labor, stops there, at the Singularity moment. Every exchange of goods stops as well.

Except for some giga-charity, which may or may not happen.

Comment author: RichardKennaway 10 August 2015 12:12:33PM 2 points [-]

What then?

They take over and rule like gods forever, reducing the mehums to mere insects in the cracks of the world.

Comment author: Thomas 10 August 2015 01:05:01PM 0 points [-]

Yes. A farmer does not want to give a bushel of wheat to these "future Singularity inventors" for free. Those guys may starve to death for all he cares, if they don't pay for the said bushel of wheat with good money.

They understand it.

Now they don't need any wheat anymore, nor anything else this farmer has to offer. Or anybody else, for that matter. Commerce has stopped here, and they see no reason to hand out tremendous gifts. They have paid for their wheat, wine and meat. Now they are done shopping.

The farmer should understand.

Comment author: Thomas 10 August 2015 11:05:04AM *  2 points [-]

I see yet another problem with the Singularity. Say that a group of people manages to ignite it. Until the day before, the team was forced to buy its food and everything else. Now, what does the baker or the pizza guy have to offer them anymore?

The team has everything to offer to everybody else, but everybody else has nothing to give back as payment for those services.

The "S team" may decide to give out colossal charity: more than everything all of us combined currently possess, to each person. That is, if the Singularity is any good, of course.

But, will they really do that?

They might decide not to. What then?
