Comment author: Jacobian 23 December 2015 06:01:03PM 5 points

A question about donating:

AFAIK, about half of the payment for attending a workshop (~$2,000) is considered a charitable donation, is tax-deductible, etc. Would it be possible for me to donate to the winter fundraiser and have the donation amount deducted from my workshop participation fee if and when I choose to attend in the future?

I think it works out great for CFAR to allow this: it gets either a pre-committed attendee or a free donation.

In response to LessWrong 2.0
Comment author: Jacobian 04 December 2015 08:44:58PM 28 points [-]

One more use I have for LessWrong: learning about subjects from people whom I trust to be smart and rational. A while back I wanted to read up on perceptual control theory, and I found Richard_Kennaway's and Vaniver's posts a hundred times better than Wikipedia.

This is an invaluable resource for me, one that I would hate to lose. Even if the quality of new material being written on LW is declining, the quality of the material I'm reading on LW is still consistently excellent. I really hope we find a way to keep this aspect going.

In response to LessWrong 2.0
Comment author: btrettel 03 December 2015 04:09:54AM 18 points

You bring up a number of important points. Perhaps I missed this when reading, but one role LessWrong has played, and continues to play, is as a good source of discussion. Often I'll find the discussion more interesting than the article itself. It's not uncommon for me to be linked to a particular comment, divorced from its larger context, and to have no interest in that larger context. I don't know how common this behavior is, but I don't think replacing the rationalist materials with a wiki or Q&A site would suit it well at all. This is one reason to favor something like Reddit.

I'm also generally not a fan of shutting down even semi-active forums. In one online community I've participated in, there were several major forum closures, and each time there was a period of confusion about where to go for discussion, along with basically sectarian posturing to attract active posters. The sectarian behavior caused major problems down the line, and the community's current discussion forum more or less voluntarily avoids those conflicts now. There are also a number of roles LessWrong plays that I'm not sure would survive a transition to the diaspora, like the page about sharing academic papers. I also often enjoy reading the open threads. Perhaps transitioning LessWrong more toward discussion would be a good middle ground.

Edit: On a related note, I find following discussions on Tumblr to be a huge pain, and I hope that either this improves in the future or more discussions happen elsewhere.

In response to comment by btrettel on LessWrong 2.0
Comment author: Jacobian 04 December 2015 08:35:26PM 10 points

Agreed, the comments (fortified by the voting system) are a huge reason why I'm here. I bought Rationality: From AI to Zombies for ease of reading, but discovered that I didn't like it at all without the discussion spawned by every post. In particular, it is very easy to be convinced by a well-written but subtly flawed argument unless an equally well-written rebuttal is in the comments.

The voting system is something I would hate to lose too; I'm very impressed that people here upvote based on quality rather than on vacuous agreement. My first three comments on the site, and one of my first posts, were massively downvoted. It hurt, but now I'm very happy it happened.

In response to comment by cousin_it on LessWrong 2.0
Comment author: Vaniver 04 December 2015 12:27:01AM 4 points

  Announce that during the next year, LW will have one post per week, at a specified time. There will be an email address where anyone can send their submissions, whereupon a horribly secretive and biased group of editors will select the best one each week, aiming for Eliezer quality or higher.

Functionally, this is turning LW into a magazine with one article per week. I think that's a decent approach, though I have some reservations.

Remember the shift from OB to LW: one of the big changes was that people went from having to email Hanson about posting something (and maybe getting shot down) to being able to post it themselves. I worry that this proposal creates too much inconvenience and risk of failure for posters, which means they'll post somewhere else instead of on LW.

But I think the tournament nature of it does improve the idea significantly: there's a post every week, so we need people to contribute, and if your post doesn't make it (or gets waitlisted, and so on) it's not because it's absolutely bad, just relatively bad.

I'm also not sure how well this plays with the fragmentation of interests among people in the community.

In response to comment by Vaniver on LessWrong 2.0
Comment author: Jacobian 04 December 2015 04:25:20PM 1 point

Re: fragmentation of interests. Posts on LessWrong seem to slide easily into a number of clear categories (epistemic rationality, fighting akrasia, decision theory math, social events...). It would be great if the site were organized to group posts together, so that if I don't know math and just want to follow the best self-help tips, it would be easy to do so.

This could work very well with the "one post a week" idea, which I'm in favor of. A consistent schedule plus high quality is what keeps people coming back; that's why so many webcomics religiously stick to their posting schedule (like XKCD's M-W-F). We could have a post every X days in each of 3-4 basic categories, so I'd know that one Wednesday is AI-post day, the next Saturday is akrasia-post day, the next Wednesday is social-post day, etc.

The main challenge would be getting enough good posts. A few thoughts on that:

  1. If the good writers contribute enough material upfront, it can create a buffer that will allow the editors to plan the best schedule, i.e. how many days between posts can be sustained consistently.

  2. I think a lot of people are already intimidated about posting, given the very high standards. If quality is more of a concern than quantity, I don't think people with something important to say will be too discouraged by having to submit to moderation. A lot of us have our own blogs, Tumblrs, Facebooks, etc. Since I know that LW has a much wider reach than my own blog, I wouldn't mind trying to "win the week" on LW first and posting on my own platform as a fallback if I don't make it.

  3. Waiting for moderation on what you wrote requires delaying gratification, which is very hard... but not something that a real rationalist would have trouble overcoming, no? ;-)

Comment author: Jacobian 06 November 2015 08:36:21PM 1 point

Reading this post for the first time just now made me so happy!

Let me explain: Eliezer doesn't sound optimistic at all in this essay, especially compared to the gung-ho "we got this!" spirit of almost every other call-to-action post. And here I am, someone so new to LW that I only just got to this post, and yet in the last year I have:

  • Enjoyed rationalist parties with dozens of people.
  • Gone to over 20 weekly rationalist gatherings.
  • Run an EA charity research meetup.
  • Gotten 3 other people hooked on the Sequences.
  • And I'm taking my girlfriend to Solstice next month.

With all that, I don't even consider myself a central part of the community (either LW or EA), just someone hanging around the margins and trying to contribute occasionally.

Here's another thing to keep in mind about the Catholic Church: they've been around for 2,000 years, and they're the one out of 2,000 cults that made it. Can it be that there's nothing going on besides time and survivor bias? Six years after this post was written, GiveWell is affecting millions and MIRI just raised $600k by itself. I don't know if we should really still be looking to the Church for organization tips.

Comment author: Jacobian 06 November 2015 03:46:24PM 8 points

Gleb, first of all, I think Intentional Insights is a great idea and I support the cause! Unfortunately, I don't think I'll be supporting it by wearing t-shirts with slogans. In particular, InIn (as I understand it) is aimed at a broader audience, while all those t-shirts are basically in-jokes for people already in the LW space and baffling to anyone else.

I second ChristianKl's idea of incorporating graphic designs, maybe an "I changed my mind today!" shirt in the style of this famous one.

I would also suggest making the slogans intriguing and humorous instead of straightforward, so they can start conversations and be less embarrassing to wear. Off the top of my head:

  • "I was stupid yesterday" instead of "Less wrong every day"
  • "Super agenty" instead of "Living on purpose"
  • "Evidence or it doesn't happen" instead of "Absence Of Evidence Is Evidence Of Absence"

None of the above are really good, but that's the direction I think the t-shirts should be going in.

Comment author: Gunnar_Zarncke 07 October 2015 06:04:37PM 0 points

I don't get what this is about from the website - and I don't watch YouTube. Can anybody explain/link?

Comment author: Jacobian 08 October 2015 02:07:15PM 5 points

It started with this comic and proceeded to happen exactly as it does in the comic.

In detail: people come up with evolutionary theories to explain human or animal characteristics. The theories have to be well supported and also 100% wrong (but mostly they have to be funny). My theory was that sleepwalking is an adaptation to keep you in good shape, one that started off with Homo erectus jogging after antelopes for days in persistence hunting. Like most modern jobs, persistence hunting doesn't require you to be actually awake.

From an LWian point of view, it's a master class in confirmation bias - it's frighteningly easy to find supporting evidence for the dumbest hypotheses in every chart or citation. For example, sleepwalking is comorbid with depression and sleep apnea, both of which can be alleviated by regular exercise. QED. Also, Google Ngram prevalences of "sleepwalking" and "overweight" have a 0.97 correlation. Coincidence? I think not!

Comment author: Jacobian 03 October 2015 04:50:33PM 13 points

Presented at BAHFest in front of more than 1,000 people at MIT, poked fun at Max Tegmark on stage (for which he gave me his signed book), went drinking with Zach Weinersmith and the other awesome contestants, and drank enough beer to numb the pain of actually coming in 2nd.

Comment author: Jacobian 03 October 2015 04:41:13PM 6 points

Spent about 20 minutes playing online; I have some technical notes and general impressions.

Technical (skip this if you're not jimrandomh):

  • The timer feels way too long, especially as people get to know the cards better and don't have to read all of them.
  • When choosing card pairs, they are displayed in long rows, so for 3 players someone's first and second cards end up on different rows. That's very unintuitive. Maybe put the pairs in separate columns?
  • When judging, seeing the timing of the cards coming out can skew the judgement, and it also makes it easy to guess which card is the control.
  • The website works smoothly, well done!

Here are my main takeaways:

  • The cards are excellent: a lot of them are either very funny or do a good job of explaining things quickly. For some, it's hard to tell which :)
  • Unfortunately, the jokes that happen during play itself aren't funny at all compared to the cards. A lot of the time there isn't a single card that will give a "funny" answer; am I supposed to choose the logically appropriate one instead? I wonder if I'd be more likely to buy the best cards as a poster than as a card game.

I'm going to try inviting some non-LW friends to play and see if they like it or run away screaming in confusion.

Comment author: ike 30 July 2015 01:44:28AM 1 point

That's exactly what I used it for in my calculation; I didn't misunderstand that. Your "conservation of expected evidence" computation simply does not work unless your prior is extremely high to begin with. Put simply, you cannot be 99% sure that you'll later raise your current credence p in H to anything greater than 100p/99, which places a severe lower bound on p if the anticipated evidence carries a 20:1 likelihood ratio.
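
(A minimal sketch of where that bound comes from, writing E for the anticipated confirming evidence, which here you expect with probability 0.99: by conservation of expected evidence,

  p = P(H) = P(E)·P(H|E) + P(¬E)·P(H|¬E) ≥ 0.99·P(H|E),

so P(H|E) ≤ p/0.99 = 100p/99.)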

Comment author: Jacobian 30 July 2015 09:05:50PM 2 points

Yes! It worked! I learned something by getting embarrassed online!!!

ike, you're absolutely correct. I applied conservation of expected evidence to likelihood ratios instead of to posterior probabilities, and thus didn't realize that the prior puts bounds on the expected likelihood ratio. This also means that the numbers I suggested (a 1% chance of a 1:2000 likelihood ratio, a 99% chance of 20:1) pin the prior down precisely, at 98.997%.
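
For anyone who wants to verify that figure, here's a quick numerical check (a sketch in Python, assuming exactly the numbers above: evidence E anticipated with probability 0.99, a 20:1 likelihood ratio if it shows up, 1:2000 if it doesn't):

    # Solve for the prior P(H) implied by:
    #   P(E) = 0.99,  P(E|H)/P(E|~H) = 20,  P(~E|H)/P(~E|~H) = 1/2000
    # Let a = P(E|H) and b = P(E|~H). Then a = 20*b, and
    # (1 - a)/(1 - b) = 1/2000  =>  2000*(1 - 20*b) = 1 - b  =>  1999 = 39999*b.
    b = 1999 / 39999
    a = 20 * b
    # Total probability: P(E) = p*a + (1 - p)*b = 0.99, solved for p:
    p = (0.99 - b) / (a - b)
    print(p)  # ~0.98997, i.e. the prior is pinned at ~98.997%

    # Sanity check via conservation of expected evidence: E[posterior] == prior.
    post_e = a * p / (a * p + b * (1 - p))                        # posterior given E
    post_not_e = (1 - a) * p / ((1 - a) * p + (1 - b) * (1 - p))  # posterior given ~E
    assert abs(0.99 * post_e + 0.01 * post_not_e - p) < 1e-9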

I'm going to leave the fight to defend the reputation of Bayesian inference to you and go do some math exercises.
