Comment author: Raythen 18 December 2014 12:22:15PM 0 points [-]

What language do you use in the meetups? I'm thinking of coming, but I don't speak Danish, only English and Swedish.

Comment author: ephion 03 June 2014 01:42:02PM 2 points [-]

Therapy or psychiatry seem like good fields to go into here.

Comment author: Raythen 09 June 2014 12:07:35PM 0 points [-]

Yeah. I have considered that.

There's overlap between empathy and therapy/psychiatry, but also important differences.

Though working with some kind of therapy might suit my personality, and the way I want to work.

Comment author: ChristianKl 05 June 2014 02:27:39PM 2 points [-]

The fact that you did it all in English suggests that you don't do it with your meatspace friends. Ask your meatspace friends.

If you want to be a coach or therapist you need to be in a position to gather clients. Empathy alone is not enough; you also need to do outreach to find clients.

Comment author: Raythen 09 June 2014 11:40:35AM 0 points [-]

I mostly agree with this.

Right now I live in a small town, and my meatspace friends don't need the particular kind of support I can offer. Outreach/community is one of the reasons, maybe the primary reason, I'm considering formal study (academic, certification, or something similar).

Still, the internet seems like a viable way to connect with people.

Comment author: AndyWood 04 June 2014 07:56:28AM 2 points [-]

Look for people who need your kindness. Things will open up for you.

Comment author: Raythen 04 June 2014 10:48:12AM *  0 points [-]

I like the way you phrased that :)

Wondering what to do with my ability for empathy and understanding people. Have some experience and perhaps opportunity to work with this professionally - advice?

1 Raythen 03 June 2014 09:57AM
(I've intentionally tried to keep this post concise. Please ask if you want more details about something) 
 
I've done some free NVC-based empathy work, starting two years ago (online, via Skype calls). 
(NVC, or Nonviolent Communication, is a communication method; the specific techniques I use are empathic listening and reflection.) 
 
Lately people have told me I'm really good at it and that I should work with it professionally. I think I would enjoy that, and it matches my career aspirations. 
 
I'm going to have free time over the summer to do something with this if I want to. 
 
(I'm 25. I live in Sweden) 
 
---
 
My general ideas are along the lines of... 
 
Get more information, potentially connect with people who already work with this and/or similar topics. 
 
Research study/certification options - there are some, though I haven't found one that seems like a good fit for me yet. 
 
Register a corporation - since I already do work that generates value for others, this would open the door to doing it professionally. (Also, the registration is free of charge.) 
 
---
 
I'm not as good at getting a read on people I know only casually or converse with only briefly (not enough data). 
 
Slow text-only communication (e-mail etc.) doesn't work - I can't iterate fast enough. Text chat might work, but it loses a lot of bandwidth compared to voice. Voice works best (or doing it in person). 
 
---
 
I am quite good at understanding people even when we have few things in common, and across quite sensitive topics (which requires a non-judgemental, non-critical approach). The only requirement, I suppose, is that the person actually wants to be deeply understood and to have that understanding reflected back verbally. 
 
---
 
Some reported benefits: 
- Increased clarity and self-understanding 
- Inspiration and clarity when it comes to specific goals and actions 
- Increased awareness of one's own values 
 
---
 
So I'm wondering what to do with all this. I could use some thoughts/advice - I share the rationalist viewpoint on life and most of the rationalist values. 
 
Some additional info 
I am a Swedish citizen, which gives me free movement and right of residence within the European Union. I speak good English (and all the empathy work I've done has been in English). I am potentially open to relocating - Sweden has fewer than 10 million residents, so I figure I might need to at some point (though I'd rather stick to English-speaking countries; I have a slight preference for warmer climates). 
Comment author: Raythen 28 May 2014 09:01:16AM *  3 points [-]

Narcissism and narcissistic parenting are very real (and hard-to-detect) problems, with potentially serious long-term consequences, so I think it's good that you brought this up.

You might also want to see http://www.reddit.com/r/raisedbynarcissists/

(as stated in another comment, though - I really don't see Harry as being narcissistic)

Comment author: ChristianKl 28 May 2014 05:23:01AM 1 point [-]

I think you are plain wrong.

There is a lot of thought in AI development about mimicking human neural decision-making processes, and it's quite possible that the first human-level AGI will be similar in structure to human decision making. Emotions are a core part of how humans make decisions.

Comment author: Raythen 28 May 2014 08:53:32AM *  0 points [-]

I should probably make clear that most of my knowledge of AI comes from LW posts, I do not work with it professionally, and that this discussion is on my part motivated by curiosity and desire to learn.

Emotions are a core part of how humans make decisions.

Agreed.


Your assessment is probably more accurate than mine.

My original line of thinking was that while AIs might use quick-and-imprecise thinking shortcuts triggered by pattern-matching (which is roughly how I see emotions), human emotions are too inconveniently packaged to be of much use in AI design. (While being necessary, they also misfire a lot; coping with emotions is an important skill to learn; in some situations emotions do more harm than good; all in all, this doesn't seem like good mind design.) So I was wondering whether we would even recognize whatever an AI uses for its thinking as emotions.

My assessment now is that even if an AI uses different thinking shortcuts than humans do, they might still misfire. For example, I can imagine one pattern activation triggering more patterns, which in turn trigger more and more patterns, resulting in a cascade effect not unlike emotional over-stimulation/breakdown in humans.
So I think it's possible that we might see AI having what we would describe as emotions (perhaps somewhat uncanny emotions, but emotions all the same).
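The cascade idea can be made concrete with a toy branching-process simulation (everything here is my own illustrative invention, not a claim about any real AI architecture): each activated pattern triggers some number of follow-on activations, and once the average number of triggers per activation crosses 1, a single activation can saturate the whole system.

```python
import random

def cascade_size(mean_triggers: float, seed: int = 0, cap: int = 10_000) -> int:
    """Total activations triggered by one initial pattern.

    Each active pattern triggers 2 follow-on patterns with probability
    mean_triggers / 2 (and 0 otherwise), so the expected number of
    triggers per activation is mean_triggers.
    """
    rng = random.Random(seed)
    frontier, total = 1, 1
    while frontier and total < cap:
        # Count the follow-on activations produced by the current frontier.
        children = sum(2 if rng.random() < mean_triggers / 2 else 0
                       for _ in range(frontier))
        frontier = children
        total += children
    return min(total, cap)
```

With `mean_triggers` below 1 the cascade almost always fizzles out; at 2.0 every activation doubles the frontier and the cap is hit almost immediately - a crude analogue of the over-stimulation failure mode described above.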


P.S. For the sake of completeness: my mental model also includes biological organisms needing emotions in order to create motivation (rather than just drawing conclusions) - for example, fear creating motivation to escape danger.
An AI should already have a supergoal, so it does not need "motivation". However, it would need to see how its current context connects to its supergoal, and create/activate subgoals that apply to the current situation; here, once again, thinking shortcuts might be useful, perhaps not too unlike human emotions.

Example: the AI sees a fast-moving object that it predicts will intersect its current location, and a thinking shortcut activates a dodging strategy. This is a subgoal of the goal of surviving, which is in turn a subgoal of the AI's supergoal (whatever that is).

Having a thinking shortcut (this one we might call a "reflex" rather than an "emotion") results in faster thinking. Slow thinking might be inefficient to the point of being fatal: "Hm... that object seems to be moving mighty fast in my direction... if it hits me it might damage/destroy me. Would that be a good thing? No, I guess not - I need to be functional in order to achieve my supergoal. So I should probably dodg.. <CRASH>"
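The reflex-vs-deliberation distinction can be sketched like this (names and structure are hypothetical, purely to illustrate the idea): a reflex is a precompiled pattern-to-action rule that bypasses the expensive chain of goal reasoning when speed matters.

```python
# Hypothetical sketch: cached "reflexes" map a percept pattern straight
# to an action, skipping the slow supergoal -> subgoal reasoning chain.
REFLEXES = {"fast_incoming_object": "dodge"}

def slow_deliberation(percept: str) -> str:
    # Explicit reasoning from the supergoal down to a situational subgoal.
    # In a real system this would be an expensive search/inference step.
    if percept == "fast_incoming_object":
        # survive -> avoid damage -> move out of the object's path
        return "dodge"
    return "continue"

def react(percept: str) -> tuple[str, str]:
    """Return (action, path taken): reflex if cached, else deliberation."""
    if percept in REFLEXES:
        return REFLEXES[percept], "reflex"
    return slow_deliberation(percept), "deliberation"
```

Both paths reach the same action for the dangerous percept; the difference is that the reflex answers in one lookup, which is the point of the shortcut.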

Comment author: Algernoq 26 May 2014 07:04:28PM 3 points [-]

Thanks for looking for contradictory evidence. I must disagree with your examples, however, because none of them seem non-narcissistic. For example, a narcissistic car salesman can feel superior and swindle people without empathy, despite: 1. Having high self-awareness of his strengths and weaknesses as a salesman, 2. Understanding people (but not actually caring about them) well enough to sell to them, and 3. Accurately perceiving reality (understanding physics, peoples' motivations, how to drive to work, how to not act crazy, etc.).

Comment author: Raythen 26 May 2014 09:07:53PM *  0 points [-]
  1. Having high self-awareness of his strengths and weaknesses as a salesman

That's not at all the same thing as having high self-awareness overall.

  1. Understanding people well enough to sell to them

Which may well be not very well at all. Understanding people in the context of sales is not the same as understanding them generally.

  1. Accurately perceiving reality (understanding physics, peoples' motivations, how to drive to work, how to not act crazy, etc.).

Which most people do (though an accurate understanding of neither physics nor people's motivations is actually required to get by). I was talking about the "Litany of Tarski"-esque desire to always seek the truth, even if it's unpleasant and emotionally painful to learn - which Harry has and very few people generally do.

Comment author: Raythen 26 May 2014 06:21:17PM *  -1 points [-]

Some of Harry's traits that strike me as strongly non-narcissistic:
- high self-awareness - which appears genuine
- capable of (correctly) understanding others' feelings and motivations - does not label or vilify people with very different values
- desire to see all things as they really are, even if it's painful (while narcissists typically have delusions)

Comment author: Raythen 25 May 2014 11:46:07AM *  2 points [-]

Asking "Would an AI experience emotions?" is akin to asking "Would a robot have toenails?"

There is little functional reason for either of them to have those, but they would if someone designed them that way.

Edit: the background for this comment - I'm frustrated by the way AI is represented in (non-rationalist) fiction.
