
Gleb_Tsipursky comments on Marketing Rationality - Less Wrong Discussion

28 points. Post author: Viliam, 18 November 2015 01:43PM




Comment author: Gleb_Tsipursky 19 November 2015 12:23:17AM 6 points [-]

They are not intended to appeal to you, and that's the point :-) If something feels cognitively easy to you and does not make you cringe at how low-level it is, then you are not the target audience. Similarly, you are not the target audience if something is overwhelming for you to read. Try to read them from the perspective of someone who does not know about rationality. A sample of evidence: this article was shared over 2K times by its readers, which suggests that tens of thousands, and maybe hundreds of thousands, of people read it.

Comment author: Lumifer 19 November 2015 04:50:11AM 0 points [-]

If something feels cognitively easy to you and does not make you cringe at how low-level it is

I don't cringe at the level. I cringe at the slimy feel and the strong smell of snake oil.

Comment author: Tem42 20 November 2015 01:28:10AM 5 points [-]

It might be useful to identify what exactly trips your snake-oil sensors here. Mine were tripped when the article claimed to be science-based but referenced no research papers; other than that, it looked okay to me.

Unless you mean simply that the site it is posted on smells of snake oil. In that case I agree, but at the same time, so what? The people who read articles on that site don't smell snake oil, whether they should or not. If the site provides its own filter for its audience, that only makes it easier for us to present more highly targeted cognitive altruism.

Comment author: Gleb_Tsipursky 20 November 2015 04:38:12AM 2 points [-]

To clarify about the science-based point, I tried to put in links to research papers, but unfortunately the editors cut most of them out. I was able to link to one peer-reviewed book, but the rest of the links had to be to other articles that contained research, such as this one from Intentional Insights itself.

Yup, very much agreed on the point of the site smelling like snake oil, and this enabling highly targeted cognitive altruism.

Comment author: Gleb_Tsipursky 19 November 2015 05:04:56AM *  3 points [-]

Yup, I hear you. I cringed at that when I was learning how to write that way, too. You can't believe how weird that feels to an academic. My Elephant kicks and screams and tries to throw off my Rider whenever I do that. It's very ughy.

However, having calculated the trade-offs and done a Bayesian-style analysis combined with a multi-attribute utility theory (MAUT) analysis, it seems that the negative feelings we at InIn get (mostly me at this point, as others are not yet writing these types of articles for fear of this kind of backlash) are worth the rewards of raising the sanity waterline of people who read those types of websites.

Comment author: Lumifer 19 November 2015 04:51:43PM 3 points [-]

I cringed at that when I was learning how to write that way, too.

So, why do you think this is necessary? Do you believe that proles have an unyielding "tits or GTFO" mindset so you have to provide tits in order to be heard? That ideas won't go down their throat unless liberally coated in slime?

It may look to you like you're raising the waterline, but from the outside it looks like all you're doing is contributing to the shit tsunami.

for fear of this kind of backlash

I think "revulsion" is a better word.

Wasn't there a Russian intellectual fad, around the end of the 19th century, about "going to the people" and "becoming of the people" and "teaching the people"? I don't think it ended well.

are worth the rewards of raising the sanity waterline

How do you know? What do you measure that tells you you are actually raising the sanity waterline?

Comment author: Gleb_Tsipursky 19 November 2015 11:49:20PM 1 point [-]

Look, we can choose to wall ourselves off from the shit tsunami out there, and stay in our safe Less Wrong corner. Or we can try to go into the shit tsunami, provide stuff that's less shitty than what people are used to consuming, and then slowly build them up. That's the purpose of Intentional Insights - to reach out and build people up to growing more rational over time. You don't have to be the one doing it, of course. I'm doing it. Others are doing it. But do you think it's better to improve the shit tsunami, or to stick our fingers in our ears, pretend it's not there, and not do anything about it? I think it's better to improve the shit tsunami of Lifehack and other such sites.

The measures we use, the methods we decided on, and our reasoning behind them are described in my comment here.

Comment author: Lumifer 20 November 2015 04:41:53PM 2 points [-]

Look, we can choose to wall ourselves off from the shit tsunami out there, and stay in our safe Less Wrong corner. Or we can try to go into the shit tsunami, provide stuff that's less shitty than what people are used to consuming, and then slowly build them up.

Well, first of all I can perfectly well stay out of the shit tsunami even without hiding in the LW corner. The world does not consist of two parts only: LW and shit.

Second, you contribute to the shit tsunami, the stuff you provide is not less shitty. It is exactly what the tsunami consists of.

That's the purpose ... it's better to improve the shit tsunami

The problem is not with the purpose. The problem is with what you are doing. Contributing your personal shit to the tsunami does not improve it.

The measures we use

You measure, basically, impressions -- clicks and eyeballs. That tells you whether the stuff you put out gets noticed. It does not tell you whether that stuff raises the sanity waterline.

So I repeat: how do you know?

Comment author: Gleb_Tsipursky 23 November 2015 01:10:30AM *  -1 points [-]

the stuff you provide is not less shitty. It is exactly what the tsunami consists of

Do you truly believe the article I wrote was no less shitty than the typical Lifehack article, for example, this article currently on their front page? Is this what a reasonable outside observer would say? I'm willing to take a $1000 bet that more than 5 out of 10 neutral, reasonable outside observers would evaluate my article as higher quality. Are you up for that bet? If not, please withdraw your claims. Thanks!

Comment author: Lumifer 23 November 2015 03:48:34PM 0 points [-]

I am not terribly interested in distinguishing the shades of brown or the nuances of aroma. To answer your question: yes, I do believe you wrote a typical Lifehack article of the typical degree of shittiness. In fact, I think you mentioned on LW your struggles in producing something sufficiently shitty for Lifehack to accept and, clearly, you have succeeded in achieving the necessary level.

As to the bet, please specify what a "neutral reasonable" observer is and how you define "quality" in this context. Also, do I take it you are offering 1:1 odds? That implies you believe the probability you will lose is just under 50%, y'know...

Comment author: gjm 23 November 2015 08:45:01PM 2 points [-]

That implies you believe the probability you will lose is just under 50%

Only if $1000 is an insignificant fraction of Gleb's wealth, or his utility-from-dollars function doesn't show the sort of decreasing marginal returns most people's do.

Comment author: Gleb_Tsipursky 24 November 2015 09:56:18PM -1 points [-]

Indeed, $1000 is a quite significant portion of my wealth.

Comment author: Gleb_Tsipursky 24 November 2015 09:55:56PM *  -1 points [-]

$1000 is not an insignificant portion of my wealth, as gjm notes. I certainly do not want to lose it.

We can take as neutral observers 10 LessWrongers who are not friends with you or me, have not participated in this thread, and do not know about this debate. They should be relatively easy to gather through posting on the open thread or elsewhere.

We can have gjm or another external observer recruit people just in case one of us doing it might bias the results.

So, going through with it?

Comment author: Lumifer 24 November 2015 10:01:29PM 1 point [-]

Sorry, I don't enjoy gambling. I am still curious about the "quality" which you say your article has and the typical Lifehack swill doesn't. How do you define that "quality"?