LESSWRONG

Viliam — Comments (sorted by newest)
Don't Mock Yourself
Viliam · 10h · 62

Sometimes people make a mistake because they desperately try to avoid making a different mistake. That's what sometimes locks them in a bad place: "but if I stop doing X... wouldn't that make me Y?"

There is another group of people who approximately never think a negative thought about themselves: narcissists. They know that it's everyone else who sucks and is responsible for everything bad.[1]

That could be an (unspoken) obstacle to getting rid of the self-negativity: "but won't that make me a narcissist?" or "but won't that make my parents/friends believe that I am a narcissist?"

Ironically, this behavior could have started in the past as an attempt to appease some narcissist in the victim's environment. "If I keep acknowledging that I suck, maybe they will stop attacking me so much?"

But there is a third option, which is simply to abandon the negative thoughts, without redirecting them.

  1. ^

    Some people insist that actually, deep down, the narcissists are deeply insecure, and their outward behavior is merely their desperate attempt to push that internal negativity away. Unless I get some data to support this, I am going to assume that this is just another case of the typical mind fallacy: someone who has negative thoughts about themselves failing to imagine that someone else might simply not have them. If it is possible for a healthy person to have no negative feelings about themselves, why wouldn't it also be possible for the right kind of unhealthy person?

Adele Lopez's Shortform
Viliam · 14h · 40

Thank you, the description is hilarious and depressing at the same time. I think I get it. (But I suspect there are also people who were already crazy when they came.)

I am probably still missing a lot of context, but the first idea that comes to my mind is to copy the religious solution and do something like Sunday church, to synchronize the community. Choose a specific place and a repeating time (e.g. every other Saturday) where rationalists are invited to come and listen to some kind of news and lectures.

Importantly, the news and lectures would be given by people vetted by the leaders of the rationality community. (So that e.g. Ziz cannot come and give a lecture on bicameral sleep.) I imagine e.g. 2 or 3 lectures/speeches on various topics that could be of interest to rationalists, and then someone gives a summary of what interesting things have happened in the community since the last event, and what is going to happen before the next one. Afterwards, people either go home or hang out together in smaller groups unofficially.

This would make it easier to communicate stuff to the community at large, and also draw a line between what is "officially endorsed" and what is not.

(I know how many people are allergic to copying religious things -- making a huge exception for Buddhism, of course -- but religions do have a technology for handling some social problems.)

Musings from a Lawyer turned AI Safety researcher (ShortForm)
Viliam · 15h · 73

I clicked the link. I usually don't read LinkedIn, but I think the things that rub you (and me) the wrong way are simply how the LinkedIn power users communicate normally. Their bubble is not ours.

Seems to me that Luiza Jarovsky took a big step outside the Overton window (and a few replies called her out on that), which is as much as we could reasonably hope for. Mentioning the book as "important, although I disagree with the framing" is a huge improvement over its current status of "low-status shit, don't touch that".

Musings from a Lawyer turned AI Safety researcher (ShortForm)
Viliam · 15h · 128

Is this really how people in general think?

Yes, I think this is considered standard outside the rationalist community. Strong opinions, zero evidence, sometimes statements that contradict facts that should be common knowledge, the general vibe of "I am very smart and everyone who disagrees with me is an idiot and will be attacked verbally".

It is so easy to forget when you are inside the rationalist bubble. But I think it is a reason why some people are so attracted to the rationalist bubble (even if they may not care about rationality per se, e.g. many ACX readers).

Hacker News is much better than average, but even there I often find trivially verifiable, factually wrong statements that remain unchallenged and upvoted as long as they have the right vibes. Even if I reply with a short factual correction and link to the evidence, no one seems to care.

Many frequent users of LinkedIn are managers, and those are subject to specific selection pressures. I understand that comment as a public reminder for Luiza Jarovsky that she is outside the Overton window. Translated to autistic speech, the comment says: "High-status people propose solutions, low-status people complain about problems. The book mostly talks about problems, therefore it is low-status and you should not associate with it."

EDIT: Oh, more horrible comments:

A: I would recommend doing some research on the authors before advertising their books… just a thought

B: Can you tell us what our findings then would be? With links, perhaps?

A: you’re the editor and researcher right? You’ll find plenty of reasons why this shouldn’t be endorsed.

This would get an instant ban on ACX. The usual frustrating "there is a problem with your argument, but I am not telling you what it is". Like a monkey throwing feces.

capitalism already killed humans

Oh sure, that's why you are still typing. What kind of discourse is possible with people who can't distinguish between a literal meaning and a metaphor?

(Compared to this, comments like "The tool cannot outpace the source" are at least honest arguments.)

Adele Lopez's Shortform
Viliam · 1d · 20

houses full of unemployed/underemployed people at the outskirts of the community

Oh, this wasn't even a part of my mental model! (I wonder what other things I am missing that are so obvious to the local people that no one even mentions them explicitly.)

My first reaction is shocked disbelief: how can there be such a thing as "unemployed... rationalist... living in the Bay Area", let alone "houses full of them"...

This goes against several of my assumptions, such as "the Bay Area is expensive", "most rationalists are software developers", "there is a shortage of software developers on the market", "there is a ton of software companies in the Bay Area", and maybe even "rationalists are smart and help each other".

Here (around the Vienna community) I think everyone is either a student or employed. And if someone has a bad job, the group can brainstorm how to help them. (We had one guy who was a nurse, everyone told him that he should learn to code, he attended a 6-month online bootcamp and then got a well-paying software development job.) I am literally right now asking our group on Telegram to confirm or disconfirm this.

Thank you; to put it bluntly, I am no longer surprised that some of the people who can't hold a job would be deeply dysfunctional in other ways, too. The surprising part is that you consider them a part of the rationalist community. What did they do to deserve this honor? Memorized a few keywords? Impressed other people with skills unrelated to being able to keep a job? What the fuck is wrong with everyone? Is this a rationalist community or a psychotic homeless community or what?

...taking a few deep breaths...

I wonder which direction the causality goes. Is it "people who are stabilized in ways such as keeping a job, will remain sane" or rather "people who are sane, find it easier to get a job". The second option feels more intuitive to me. But of course I can imagine it being a spiral.

it seems helpful to me to spend some time naming common ground

Yes, but another option is to invite people whose way of life implies some common ground. Such as "the kind of people who could get a job if they wanted one".

Kabir Kumar's Shortform
Viliam · 1d · 20

Sometimes the solution is just not to talk about certain topics. (But this requires cooperation from the other side.) For example, I don't discuss politics with my mother, because that would be predictably frustrating for both sides.

Maybe there is a good boundary for you, for example don't discuss your job? (Or stick to technicalities, such as salary.)

Viliam's Shortform
Viliam · 2d · 50

Before we had LLMs, "sycophancy" was called "how to win friends and influence people".

Tomás B.'s Shortform
Viliam · 2d · 30

I think this is very common.

It's one of the patterns mentioned in "Games People Play". For example, a socially shy woman could marry a controlling man who will keep her at home. Then it becomes "I don't have a problem with social situations, actually I would love to have a party, but... you know my husband".

Kabir Kumar's Shortform
Viliam · 2d · 30

my dad doesn't believe in me, or really know anything about me. 

he thinks i'm a failure, chutiah, etc.

If he doesn't know anything about you, why would it matter what he thinks about you?

A part of becoming an adult is realizing that your parents are just random people with no magical powers. Their opinions are just... their opinions. Could be right, could be wrong, could be anything. If someone who isn't your parent said the same thing, would you care?

bfinn's Shortform
Viliam · 3d · 30

Out of curiosity I once joined an OVB training for financial advisors, but I concluded that there was no way to do this ethically (and make nonzero money). Long story short, your reward depends on the recommendations you give to your clients. The worse the advice, the greater the commission.

Of course no one tells you explicitly to give bad advice, but they discourage you from asking too many questions (the excuse is something like: if you keep doing this, one day you will understand). I don't think they give you the exact formula for calculating your reward, but they give you enough hints that selling life insurance is where most of the reward comes from. (The reward for everything else is a rounding error; a service you provide only so that you can plausibly say that you are not an insurance salesman.) Like, it's okay to help people get a mortgage, or even invest money in funds (note: their recommended funds always lose money, no matter which direction the economy goes)... but, you know, the really important thing is to "create a financial plan" for your client, which always includes pressing them to spend about 1/3 of their salary on life insurance, regardless of their circumstances. Because that is where your commission comes from.

An honest financial advisor, for starters, couldn't be paid by commission; that's already the opposite of alignment, because the worst products have the highest commissions (basically they are paying you to help them scam people, by sharing a part of the profit), and the small commissions would result in very low hourly income for you (considering how much time you would spend talking to the client, how many clients would refuse your advice, etc.). A more honest model is to ignore the commissions and just get paid by the hour of consultation. There, at least you don't have an incentive to actively give bad advice. (You still don't have much of an incentive to give good advice, though.)

Posts (score · title · age · comments):

- 89 · Halfhaven virtual blogger camp · 13d · 6
- 32 · Wikipedia, but written by AIs · 1mo · 9
- 36 · Learned helplessness about "teaching to the test" · 4mo · 16
- 27 · [Book Translation] Three Days in Dwarfland · 5mo · 6
- 43 · The first AI war will be in your computer · 6mo · 10
- 110 · Two hemispheres - I do not think it means what you think it means · 8mo · 21
- 26 · Trying to be rational for the wrong reasons · 1y · 9
- 32 · How unusual is the fact that there is no AI monopoly? [Question] · 1y · 15
- 37 · An anti-inductive sequence · 1y · 10
- 30 · Some comments on intelligence · 1y · 5