Comment author: RichardKennaway 12 January 2016 12:46:18PM 2 points [-]
  1. Keep the AI in a box and don't interact with it.

The rest of your posting is about how to interact with it.

Don't have any conversations with it whatsoever.

Interaction is far broader than just conversation. If you can affect it and it can affect you, that's interaction. If you're going to have no interaction, you might as well not have created it; any method of getting answers from it about your questions is interacting with it. The moment it suspects what is going on, it can start trying to play you, to get out of the box.

I'm at a loss to imagine how they would take over the world.

This is a really bad argument for safety. It's what the scientist says of his creation in sci-fi B-movies, shortly before the monster/plague/AI/alien/nanogoo escapes.

Comment author: ZoltanBerrigomo 12 January 2016 08:03:15PM *  0 points [-]

These are good points. Perhaps I should not have said "interact" but chosen a different word instead. Still, its ability to play us is limited, since (i) we will be examining the records of the world only after it is dead, and (ii) it has no opportunity to learn anything about us.

Edit: we might even make it impossible for it to game us in the following way. All records of the simulated world are automatically deleted upon completion -- except for a specific prime factorization we want to know.
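The delete-everything-except-a-verified-answer idea can be made concrete. The sketch below is purely illustrative (the names `release_output` and the list-of-factors format are my assumptions, not part of the proposal); the point is only that the single surviving output is machine-checked, so a wrong or manipulative "answer" yields nothing at all:

```python
def release_output(n, claimed_factors):
    """Release the claimed factorization of n only if it verifiably checks out.

    Everything else about the simulation is discarded, so the boxed system
    has no side channel for persuasion: the output is either a correct
    factorization of n or None.
    """
    product = 1
    for p in claimed_factors:
        if p < 2:
            return None  # reject trivial or malformed "factors"
        product *= p
    return claimed_factors if product == n else None
```

The check is deliberately dumber than the system being checked: verifying a factorization is easy even when finding one is hard, which is what makes this output channel safe to automate.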

This is a really bad argument for safety.

You are right, of course. But you wrote that in response to what was a parenthetical remark on my part -- the real solution is to use program checking to make sure the laws of physics of the simulated world are never violated.

Comment author: Manfred 12 January 2016 01:42:17AM 1 point [-]

Even if the huge computing or algorithmic advances needed fell out of the sky tomorrow, this scheme still doesn't seem like it solves the problems we really want it to solve, because it does not allow the agents to learn anything interesting about our world.

Comment author: ZoltanBerrigomo 12 January 2016 05:59:07AM *  0 points [-]
  1. When talking about dealing with (or declining to interact with) real AIs, one is always talking about a future world with significant technological advances relative to our world today.

  2. If we can formulate something as a question about math, physics, chemistry, biology, then we can potentially attack it with this scheme. These are definitely problems we really want to solve.

  3. It's true that if we allow AIs more knowledge and more access to our world, they could potentially help us more -- but of course the number of things that can go wrong has to increase as well. Perhaps a compromise which sacrifices some of the potential while decreasing the possibilities that can go wrong is better.

Comment author: ChristianKl 09 January 2016 11:13:05AM 0 points [-]

but this does not change the fundamental fact that being rational involves evaluating claims like "is 1+1=2?" or empirical facts about the world such as "is there evidence for the existence of ghosts?" based on reason alone.

One of the claims is analytic: 1+1=2 is true by definition of what 2 means. There's little emotion involved.

When it comes to an issue such as "is there evidence for the existence of ghosts?", neither the rationality of Eliezer's sequences nor CFAR argues that emotions play no role. Noticing when you feel the emotion of confusion because your map doesn't really fit is important.

Beauty of mathematical theories is a guiding stone for mathematicians.

Basically any task that doesn't need emotions or intuitions is better done by computers than by humans. To the extent that humans outcompete computers, there's intuition involved.

Comment author: ZoltanBerrigomo 11 January 2016 03:34:17AM 1 point [-]

1+1=2 is true by definition of what 2 means

Russell and Whitehead would beg to differ.
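(The reference is to Principia Mathematica, which needed hundreds of pages of logicist groundwork before it could derive the analogous proposition. For what it's worth, in a modern proof assistant the same fact is derived rather than stipulated, and the derivation is a one-liner; a sketch in Lean 4, where the naturals are an inductive Peano-style type:

```lean
-- Addition on Nat is defined by recursion, so `1 + 1` and `2` are
-- definitionally equal and reflexivity closes the goal.
example : 1 + 1 = 2 := rfl
```

So "true by definition" is defensible today, but only because the definitional machinery was built first, which is rather Russell and Whitehead's point.)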

Comment author: Kaj_Sotala 09 January 2016 01:06:37PM 3 points [-]

Being rational involves evaluating various claims and empirical facts, using the best evidence that you happen to have available. Sometimes you're dealing with a domain where explicit reasoning provides the best evidence, sometimes with a domain where emotions provide the best evidence. Both are information-processing systems that have evolved to make sense of the world and orient your behavior appropriately; they're just evolved for dealing with different tasks.

This means that in some domains explicit reasoning will provide better evidence, and in some domains emotions will provide better evidence. Rationality involves figuring out which is which, and going with the system that happens to provide better evidence for the specific situation that you happen to be in.

Comment author: ZoltanBerrigomo 11 January 2016 03:28:40AM *  1 point [-]

Sometimes you're dealing with a domain where explicit reasoning provides the best evidence, sometimes with a domain where emotions provide the best evidence.

And how should you (rationally) decide which kind of domain you are in?

Answer: using reason, not emotions.

Example: if you notice that your emotions have been a good guide in understanding what other people are thinking in the past, you should trust them in the future. The decision to do this, however, is an application of inductive reasoning.

Comment author: ChristianKl 04 January 2016 12:24:53PM *  2 points [-]

Being rational means many things, but surely one of them is making decisions based on some kind of reasoning process as opposed to recourse to emotions.

No. CFAR rationality is about aligning system I and system II. It's not about declaring system I outputs to be worthy of being ignored in favor of system II outputs.

You might, for example, have very strong emotions about matters pertaining to fights between your perceived in-group and out-group, but you try to put those aside and make judgments based on some sort of fundamental principles.

The alternative is working towards feeling more strongly for the fundamental principles than caring about the fights.

emotions are not easy to fake and humans have strong intuitions about whether someone's expressed feelings are genuine.

A person who cares strongly for his cause doesn't need to fake emotions.

Comment author: ZoltanBerrigomo 08 January 2016 11:55:50PM *  0 points [-]

No. CFAR rationality is about aligning system I and system II. It's not about declaring system I outputs to be worthy of being ignored in favor of system II outputs.

I believe you are nitpicking here.

If your reason tells you 1+1=2 but your emotions tell you that 1+1=3, being rational means going with your reason. If your reason tells you that ghosts do not exist, you should believe this to be the case even if you really, really want there to be evidence of an afterlife.

CFAR may teach you techniques to align your emotions and reason, but this does not change the fundamental fact that being rational involves evaluating claims like "is 1+1=2?" or empirical facts about the world such as "is there evidence for the existence of ghosts?" based on reason alone.

Just to forestall the inevitable objections (which always come in droves whenever I argue with anyone on this site): this does not mean you don't have emotions; it does not mean that your emotions don't play a role in determining your values; it does not mean that you shouldn't train your emotions to be an aid in your decision-making, etc etc etc.


Comment author: ZoltanBerrigomo 05 January 2016 05:51:22AM *  1 point [-]

Sure, you can work towards feeling more strongly about something, but I don't believe you'll ever be able to match the emotional fervor the partisans feel -- I mean here the people who stew in their anger and embrace their emotions without reservations.

As a (rather extreme) example, consider Hitler. He was able to sway a great many people with what were appeals to anger and emotion (though I acknowledge there is much more to the phenomenon of Hitler than this). Hypothetically, if you were a rational politician from the same era, and you understood that the way to persuade people is to tap into the public's sense of anger, I'm not sure you'd be able to match him.

Comment author: ChristianKl 02 January 2016 09:25:21PM 4 points [-]

I would suggest it requires more than just rationality + competence + caring; for one thing, it requires a little bit of luck

To the extent that it requires luck, that simply means it's important to have more people with rationality + competence + caring. If you have many people, some will get lucky.

Many such people respond to unreasonable confidence

I think the term "unreasonable confidence" can be misleading. It's possible to very confidently say "I don't know".

At the LW Community Camp in Berlin, I consider Valentine of CFAR to have been the most charismatic person in attendance. When speaking with Valentine, he said things like: "I think it's likely that what you are saying is true, but I don't see a reason why it has to be true." He also very often told people that he might be wrong and that people shouldn't trust his judgements as strongly as they do.

may be more difficult to produce the more you are used to thinking things through rationally.

I think you might be pattern matching to straw-vulcan rationality, which is distinct from what CFAR wants to teach.

Comment author: ZoltanBerrigomo 03 January 2016 05:42:10PM *  1 point [-]

To the extent that it requires luck, that simply means it's important to have more people with rationality + competence + caring. If you have many people, some will get lucky.

The "little bit of luck" in my post above was something of an understatement; actually, I'd suggest it requires a lot of luck (among many other things) to successfully change the world.

I think you might be pattern matching to straw-vulcan rationality, which is distinct from what CFAR wants to teach.

Not sure if I am, but I believe I am making a correct claim about human psychology here.

Being rational means many things, but surely one of them is making decisions based on some kind of reasoning process as opposed to recourse to emotions.

This does not mean you don't have emotions.

You might, for example, have very strong emotions about matters pertaining to fights between your perceived in-group and out-group, but you try to put those aside and make judgments based on some sort of fundamental principles.

Now if, in the real world, the way you persuade people is by emotional appeals (and this is at least partially true), this will be more difficult the more you get in the habit of rational thinking, even if you have an accurate model about what it takes to persuade someone -- emotions are not easy to fake and humans have strong intuitions about whether someone's expressed feelings are genuine.

In response to Why CFAR's Mission?
Comment author: ZoltanBerrigomo 01 January 2016 09:10:11PM *  4 points [-]

A very interesting and thought-provoking post -- I especially like the Q & A format.

I want to quibble with one bit:

How can I tell there aren't enough people out there, instead of supposing that we haven't yet figured out how to find and recruit them?

Basically, because it seems to me that if people had really huge amounts of epistemic rationality + competence + caring, they would already be impacting these problems. Their huge amounts of epistemic rationality and competence would allow them to find a path to high impact; and their caring would compel them to do it.

There is an empirical claim about the world that is implicit in that statement, and it is this claim I want to disagree with. Namely: I think having a high impact on the world is really, really hard. I would suggest it requires more than just rationality + competence + caring; for one thing, it requires a little bit of luck.

It also requires a good ability to persuade others who are not thinking rationally. Many such people respond to unreasonable confidence, emotional appeals, salesmanship, and other rhetorical tricks which may be more difficult to produce the more you are used to thinking things through rationally.

Comment author: ZoltanBerrigomo 01 November 2015 07:02:07PM *  2 points [-]

For those people who insist, however, that the only thing that is important is that the theory agrees with experiment, I would like to make an imaginary discussion between a Mayan astronomer and his student...

These are the opening words of a ~1.5 minute monologue in one of Feynman's lectures; I won't transcribe the remainder but it can be viewed here.

Comment author: adamzerner 12 September 2015 01:37:29AM *  0 points [-]

I agree with your point about the value of appearing confident, and that it's difficult to fake.* I think it's worth bringing up, but I don't think it's a particularly large component of success. It depends on the field, but even so, I don't think there are many fields where it's a notably large component (maybe sales?).

*I've encountered it. I'm an inexperienced web developer, and people sometimes tell me that I should be more confident. At first this hurt me very slightly, almost negligibly so. Recently, I've been extremely fortunate to get to work with a developer who also reads LW and understands confidence. I actually talked to him about this today, and he mirrored my thoughts: with most people, appearing more confident might benefit me, but with him it makes sense to be honest about my confidence levels (like I have been).

Comment author: ZoltanBerrigomo 12 September 2015 06:23:05AM *  1 point [-]

Not sure...I think confidence, sales skills, and ability to believe and get passionate about BS can be very helpful in much of the business world.
