Comment author: XFrequentist 01 August 2010 07:46:57PM *  21 points [-]

I'm intrigued by the idea of trying to start something like a PUA community that is explicitly NOT focused on securing romantic partners, but rather on the deliberate practice of general social skills.

It seems like there's a fair bit of real knowledge in the PUA world, that some of it is quite a good example of applied rationality, and that much of it could be extremely useful for purposes unrelated to mating.

I'm wondering:

  • Is this an interesting idea to LWers?
  • Is this the right venue to talk about it?
  • Does something similar already exist?

I'm aware that there was some previous conversation around similar topics and their appropriateness to LW, but if there was a final consensus I missed it. Please let me know if these matters have been deemed inappropriate.

Comment author: Violet 03 August 2010 06:34:15AM *  5 points [-]

If you want non-PC approaches, there are two communities you could look into: salespeople and con artists. The second actually has most of the how-to-hack-people's-minds knowledge. If you want a kinder version, look under the title "social engineering".

Comment author: Eliezer_Yudkowsky 24 July 2010 03:17:32AM 5 points [-]

Popularized evo-psych seems to be a lot like arguing that a certain way of life is "natural" and thus "good".

Does anyone actually do this? Do I just hang out in the wrong circles? Are there people who do this and yet I never talk to any of them or read anything they write?

Comment author: Violet 24 July 2010 07:26:13PM 3 points [-]

Yes, unfortunately people do this.

Comment author: Roko 23 July 2010 02:16:23PM 0 points [-]

"As to the uFAI, do you have some data that it is a likely result?"

What do you mean by data?

It is the consensus SIAI belief that P(uFAI) > P(FAI). The justification is that uFAI is a lot easier to make.

Comment author: Violet 23 July 2010 02:24:54PM 0 points [-]

I meant data that "such a uFAI that it will punish in such a selective fashion" is a likely result.

Comment author: Violet 23 July 2010 02:02:11PM 3 points [-]

How about simply having an inner circle of friends who share your preferences for existential risk mitigation? Don't aim for the average; find a niche where you will thrive.

In the larger setting, people don't know whom you donate to, so there is no weirdness signal from that.

As to the uFAI, do you have some data that it is a likely result, rather than an irrational fear? The uFAI has no reason not to torture you even if you had devoted all your income to creating it. Why should it care whether someone helped mitigate existential risks with 0%, 10%, or 95% of their income?

Comment author: Violet 23 July 2010 01:52:54PM 1 point [-]

Have you considered that some of us might have utility functions that include terms for socially distant people? Charity can then give us direct utility, which the analysis seems to ignore.

Second, endpoints are rarely optimal. E.g. eating only tuna and nothing else would be unhealthy and weird, but that does not imply that eating some tuna is unhealthy or weird. Thus your analysis seems to miss the obvious answer.

Comment author: Violet 23 July 2010 08:36:09AM 15 points [-]

The Sex at Dawn story is nice but the whole debate seems backwards.

Everyone picks their favorite modern social model and then molds citations and stories to support the claim that it must be natural, even among the ancient hunter-gatherers...

Popularized evo-psych seems to be a lot like arguing that a certain way of life is "natural" and thus "good".

btw, is there a name for the "natural -> good" bias/fallacy?

In response to comment by Violet on Fight Zero-Sum Bias
Comment author: Thomas 18 July 2010 10:31:18AM 5 points [-]

Sure. You can imagine two seemingly equal tribes, one with a much more advanced status structure: the chief is more revered, there is a shaman with his own charisma and high status, and every member has his own higher-than-zero place. A kind of Vanity Fair, nonetheless.

And we can play this status game in smaller groups as well. Vote me up, I'll vote you up, and we will both gain status. Together we will cut ourselves a slightly bigger piece of the karma cake.

A nationalist leader may tell his people that they are special. If they decide to believe him, everyone's status will go up. At least they will think so, and that is all that counts in the status game anyway.

In response to comment by Thomas on Fight Zero-Sum Bias
Comment author: Violet 19 July 2010 08:54:19PM 1 point [-]

I think the issue is whether to use "relative status" or "absolute status".

For example, with karma it is not the absolute numbers that matter but their relative values. A couple of friends voting each other up raises the average (and the mode, and whatever statistical marker one prefers), so while their absolute status rises, the relative status of everyone else sinks.
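The relative-vs-absolute distinction can be made concrete with a toy calculation (the names and karma scores here are illustrative assumptions, not anything from the thread):

```python
def relative(karma):
    """Each person's karma relative to the group average."""
    avg = sum(karma.values()) / len(karma)
    return {name: score - avg for name, score in karma.items()}

karma = {"A": 10, "B": 10, "C": 10}
print(relative(karma))  # everyone at 0.0: equal relative status

# A and B vote each other up: their absolute karma rises...
karma["A"] += 1
karma["B"] += 1
print(relative(karma))  # ...but C, unchanged in absolute terms, is now below average
```

Absolute karma is positive-sum (the upvotes create new points), but relative standing within the group remains zero-sum: A and B can only rise relative to the average by pushing C below it.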

I think we may have different notions of status; I am thinking of "relative status inside a given group".

Comment author: Thomas 18 July 2010 08:47:18AM 2 points [-]

The status game is not entirely zero-sum either.

In response to comment by Thomas on Fight Zero-Sum Bias
Comment author: Violet 18 July 2010 10:12:52AM 2 points [-]

Could you elaborate or point to a link about status being positive sum?

Comment author: sixes_and_sevens 29 June 2010 02:52:58PM 12 points [-]

I should probably provide a corollary to this. It's an interesting question and deserves more than a pithy one-word response.

Logistics:

It is difficult enough to coordinate the work diaries, social calendars, birthdays, anniversaries, dietary requirements, travel plans, in-laws, etc. of two reasonably busy people who live in close proximity to one another. The more people and locations you add, the more any orchestration problem compounds.

Economics:

I claim romantic relationships do not enjoy economies of scale: the overhead of each additional relationship actually increases, roughly logarithmically. I also claim additional partners are subject to diminishing returns. In fairness, if this is accurate, it is less a case against polyamory and more a case against an arbitrarily high number of partners. Still, it's not unreasonable to suggest that the optimal number of partners for a typical person is between 0 and 2.
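One way to read this claim as a toy model (a sketch under assumed functional forms, not the commenter's actual numbers): if the marginal benefit of the k-th partner diminishes like 1/k while the marginal overhead grows like log(k+1), the net value peaks at a small number of partners:

```python
import math

def net_value(n):
    """Net value of n partners: diminishing total benefit minus growing total overhead."""
    benefit = sum(1.0 / k for k in range(1, n + 1))           # diminishing returns
    overhead = sum(math.log(k + 1) for k in range(1, n + 1))  # rising marginal overhead
    return benefit - overhead

best = max(range(6), key=net_value)
print(best)  # 1: under these assumed curves, one partner maximizes net value
```

The exact optimum depends entirely on the assumed curves; the point is only that diminishing returns plus rising per-relationship overhead yields an interior optimum rather than "more is better".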

"Love Anarchy":

Much like the international system, my love life has no police force. I am generally quite pleased with this state of affairs. In a monogamous relationship, my partner and I each have a single trade partner for our romantic resources. The quantity of those resources may or may not be to our exact liking, but the distribution is not contested. This is a relatively stable system. Once a third (or fourth, or fifth...) party becomes involved, we have a negotiation problem.

There's no feasible method for someone to commit to a set distribution of time/effort/attention between partners. I'm not saying there should be; I'm just pointing out that such things can't realistically be budgeted for or enforced. The absence of such a mechanism makes polyamory highly unstable compared to monogamy, though I suppose this only really sits in the pro-monogamy column if you place a premium on stability.

Comment author: Violet 29 June 2010 07:57:01PM 5 points [-]

Actually, the logistics are not so clear-cut.

Let's say Sarah has two partners, Tom and Maria, and Sarah has a Wednesday afternoon free. The probability that at least one of her partners also has free time is higher than it would be in a monogamous arrangement.

The scheduling requirement is not necessarily "everyone must be available" but rather "some suitable combination of people must be available".
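Under the simplifying assumption that each partner is independently free on a given afternoon with some probability p, the chance that at least one is available rises with the number of partners. A quick sketch (the value p = 0.5 is an illustrative assumption):

```python
def p_someone_free(p_free, n_partners):
    """Probability that at least one of n independently available partners is free."""
    return 1 - (1 - p_free) ** n_partners

p = 0.5  # assumed chance any one partner is free on a given afternoon
print(p_someone_free(p, 1))  # 0.5  (monogamous case)
print(p_someone_free(p, 2))  # 0.75 (Sarah with Tom and Maria)
```

Of course, this cuts the other way for gatherings that require everyone at once: the probability that all n partners are simultaneously free falls as n grows, which is the logistics objection above.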

In response to Abnormal Cryonics
Comment author: Violet 27 May 2010 08:58:20AM *  4 points [-]

I don't like long-term cryonics, for the following reasons: 1) If an unmodified Violet were revived, she would not be happy in the far future. 2) If a sufficiently modified Violet were revived, she would not be me. 3) I don't place a large value on there being a "Violet" in the far future. 4) There is a risk that my values and the values of whoever wakes Violet up would be incompatible, and avoiding any possible "fixing" of my brain is a very high priority. 5) Thus I don't want to be revived by the far future, and death without cryonics seems a safe way to ensure that.
