Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Comment author: shminux 27 March 2015 09:40:13PM *  12 points [-]

There is no reason one can't link a blog post here and start a discussion. I don't know Scott personally, but I am quite sure he will not mind. I am almost as sure he will not bother commenting in both places.

What I would really like to see is someone compiling a book called, say, SSCequences from his best posts and maybe best comments on them. And an audiobook. And...

Comment author: Solvent 28 March 2015 02:47:54AM 8 points [-]

I have made bootleg PDFs in LaTeX of some of my favorite SSC posts, and gotten him to sign printed out and bound versions of them. At some point I might make my SSC-to-LaTeX script public...
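The script itself was never published, but the core HTML-to-LaTeX step of such a converter could be sketched roughly like this (a hypothetical reconstruction, not the actual script; a real tool such as pandoc would handle many more cases):

```python
# Hypothetical sketch of an SSC-post-to-LaTeX converter: extract <p> text
# from a post's HTML and emit a minimal LaTeX document. Only a handful of
# LaTeX special characters are escaped here; this is not production-grade.
from html.parser import HTMLParser

LATEX_ESCAPES = {"&": r"\&", "%": r"\%", "$": r"\$", "#": r"\#",
                 "_": r"\_", "{": r"\{", "}": r"\}"}

def escape_latex(text):
    # Escape the common LaTeX special characters (partial list).
    return "".join(LATEX_ESCAPES.get(ch, ch) for ch in text)

class ParagraphExtractor(HTMLParser):
    """Collects the text content of every <p> element."""
    def __init__(self):
        super().__init__()
        self.paragraphs = []
        self._in_p = False
        self._buf = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self._in_p = True
            self._buf = []

    def handle_endtag(self, tag):
        if tag == "p" and self._in_p:
            self._in_p = False
            text = "".join(self._buf).strip()
            if text:
                self.paragraphs.append(text)

    def handle_data(self, data):
        if self._in_p:
            self._buf.append(data)

def post_html_to_latex(html, title):
    # Parse the post body and wrap the paragraphs in a LaTeX article.
    parser = ParagraphExtractor()
    parser.feed(html)
    body = "\n\n".join(escape_latex(p) for p in parser.paragraphs)
    return ("\\documentclass{article}\n"
            f"\\title{{{escape_latex(title)}}}\n"
            "\\begin{document}\n\\maketitle\n"
            f"{body}\n\\end{{document}}\n")
```

Feeding this the HTML of a blog post and compiling the output with pdflatex would give a bare-bones PDF; links, footnotes, and images would need additional handling.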

Comment author: ozziegooen 25 December 2014 05:56:01AM *  6 points [-]

I was one of the people who expressed an opinion against the LW content. In general I liked the event, but found those parts off-putting. I'm really surprised that people new to it seemed so oblivious.

Perhaps one reason why people who were familiar with that content were hesitant about showing it to others was that they were afraid it would reflect poorly on them. If I brought a bunch of 'regular' friends to a 'transhumanist' meetup that I told them I was somewhat involved in, I would really be afraid of them getting a poor impression of transhumanism.

It's kind of like taking your significant other to meet your parents. Your significant other may not mind your parents' quirks (or vice versa), but you notice every one and are horrified.

Another thing that comes to mind is that some of the 'serious' talk was controversial even among this crowd. Personally, I really don't believe that humans should live forever, for example. Here, the people who care the most about it would also care the most about discrepancies. For instance, a very devout Catholic would be the first to be angered by what they feel is a wrong or mistaken representation of Catholicism at what seems like a very sacred event.

Overall though, thanks for getting feedback and writing this all up! I'm really interested in how it progresses.

Comment author: Solvent 25 December 2014 06:32:04AM 1 point [-]

I feel exactly the same way about the controversial opinions.

Comment author: ShIxtan 09 October 2014 12:01:19AM *  4 points [-]

Long-time lurker here. I just recently got accepted to App Academy (a big part of my inspiration for applying came from this post), and I'm really excited to attend some meetups in the area.

I have a few questions for Less Wrong people in the area, and this seemed like a good place to post them:

  1. I'll be going in December, any chance I'll have Less Wrong company?

  2. I understand that at least a few folks from here have been to App Academy. Any advice? I've got an Associate's in CS, and none of the prep work they've given me is too difficult, but is there anything else I should do to prepare?

  3. They allow students to stay there, but I'm hoping to bring my fiancee with me. Unfortunately, rent seems to be ridiculous and I have no idea where to look (I've never moved to a city that wasn't within driving distance). What's the best way to find apartments in SF?

  4. Related to the above, is anyone in the SF area looking for a roommate (or two) starting December? We are clean, quiet, and can be very unobtrusive if need be. The main issue I see is that we would prefer to also bring our cat along.

Comment author: Solvent 28 October 2014 10:40:03AM 0 points [-]

I used to work at App Academy, and have written about my experiences here and here.

You will have a lot of LW company in the Bay Area (including me!). There will be another LWer who isn't Ozy in that session too.

I'm happy to talk to you in private if you have any more questions.

Comment author: John_Maxwell_IV 18 April 2014 05:58:45PM *  3 points [-]

This guide looks decent to me. I've heard that programming bootcamps like App Academy are already doing a lot to increase the number of entry-level Rails developers in the market. I haven't really heard of any non-web-development-focused bootcamps, though. So it might not be a bad idea to focus on building skills in mobile development (supposedly iOS is really hot right now), data science, computer security, etc. instead of Rails and JavaScript. For someone taking this route, I would recommend Learn Python the Hard Way to gain a foundation in programming (Ruby tends to be used for web development only; Python is used for all sorts of stuff) and then choose a specialty and focus on it.

Edit: this guide seems to confirm my intuition that most bootcamps are web-focused. Additionally, I would argue that the web application programming model is a fairly inelegant one and might be a relatively bad place to start learning programming conceptually. Here is a salary guide for various computer programming specialties; you can also do searches like this. By the way, I've heard that White Hat Security is a good place to get an entry-level application security job, even if you don't have any programming experience.

Comment author: Solvent 23 April 2014 12:52:33AM *  2 points [-]

Zipfian Academy is a bootcamp for data science, but it's the only non web dev bootcamp I know about.

Comment author: Jayson_Virissimo 08 April 2014 06:06:03PM *  11 points [-]

App Academy has been discussed here before and several Less Wrongers have attended (such as ChrisHallquist, Solvent, Curiouskid, and Jack).

I am considering attending myself during the summer and am soliciting advice pertaining to (i) maximizing my chance of being accepted to the program and (ii) maximizing the value I get out of my time in the program given that I am accepted. Thanks in advance.

EDIT: I ended up applying and just completed the first coding test. Wasn't too difficult. They give you 45 minutes, but I only needed < 20.

EDIT2: I have reached the interview stage. Thanks everyone for the help!

EDIT3: Finished the interview. Now awaiting AA's decision.

EDIT4: Yet another interview scheduled...this time with Kush Patel.

EDIT5: Got an acceptance e-mail. Decision time...

EDIT6: Am attending the August cohort in San Francisco.

Comment author: Solvent 09 April 2014 09:45:18PM *  6 points [-]

I work at App Academy, and I'm very happy to discuss App Academy and other coding bootcamps with anyone who wants to talk about them with me.

I have previously Skyped LWers to help them prepare for the interview.

Contact me at bshlegeris@gmail.com if interested (or in comments here).

Comment author: ChrisHallquist 03 March 2014 12:42:28AM *  5 points [-]

So although I would endorse Aumann-adjusting as a final verdict with many of the people on this site, I think it's great that we have discussions - even heated discussions - first, and I think a lot of those discussions might look from the outside like disrespect and refusal to Aumann adjust.

I agree that what look like disrespectful discussions at first could eventually lead to Aumann agreement, but my impression is that there are a lot of persistent disagreements within the online rationalist community. Eliezer's disagreements with Robin Hanson are well-known. My impression is that even people within MIRI have persistent disagreements with each other, though not as big as the Eliezer-Robin disagreements. I don't know for sure Alicorn and I would continue to disagree about the ethics of white lies if we talked it out thoroughly, but it wouldn't remotely surprise me. Et cetera.

The role that IQ is playing here is that of a quasi-objective Outside View measure of a person's ability to be correct and rational. It is, of course, a very very lossy measure that often goes horribly wrong. On the other hand, it makes a useful counterbalance to our subjective measure of "I feel I'm definitely right; this other person has nothing to teach me."

So we have two opposite failure modes to avoid here. The first failure mode is the one where we fetishize the specific IQ number even when our own rationality tells us something is wrong - like Plantinga being apparently a very smart individual, but his arguments being terribly flawed. The second failure mode is the one where we're too confident in our own instincts, even when the numbers tell us the people on the other side are smarter than we are. For example, a creationist says "I'm sure that creationism is true, and it doesn't matter whether really fancy scientists who use big words tell me it isn't."

I guess I need to clarify that I think IQ is a terrible proxy for rationality, that the correlation is weak at best. And your suggested heuristic will do nothing to stop high IQ crackpots from ignoring the mainstream scientific consensus. Or even low IQ crackpots who can find high IQ crackpots to support them. This is actually a thing that happens with some creationists—people thinking "because I'm an <engineer / physicist / MD / mathematician>, I can see those evolutionary biologists are talking nonsense." Creationists would do better to attend to the domain expertise of evolutionary biologists. (See also: my post on the statistician's fallacy.)

I'm also curious as to how much of your willingness to agree with me in dismissing Plantinga is based on him being just one person. Would you be more inclined to take a sizeable online community of Plantingas seriously?

Unless you are way way way more charitable than I am, I have a hard time believing that you are anywhere near the territory where the advice "be less charitable" is more helpful than the advice "be more charitable".

As I said above, you can try to pinpoint where to apply this advice. You don't need to be charitable to really stupid people with no knowledge of a field. But once you've determined someone is in a reference class where there's a high prior on them having good ideas - they're smart, well-educated, have a basic commitment to rationality - advising that someone be less charitable to these people seems a lot like advising people to eat more and exercise less - it might be useful in a couple of extreme cases, but I really doubt it's where the gain for the average person lies.

On the one hand, I dislike the rhetoric of charity as I see it happen on LessWrong. On the other hand, in practice, you're probably right that people aren't too charitable. In practice, the problem is selective charity—a specific kind of selective charity, slanted towards favoring people's in-group. And you seem to endorse this selective charity.

I've already said why I don't think high IQ is super-relevant to deciding who you should read charitably. Overall education also doesn't strike me as super-relevant either. In the US, better educated Republicans are more likely to deny global warming and think that Obama's a Muslim. That appears to be because (a) you can get a college degree without ever taking a class on climate science and (b) more educated conservatives are more likely to know what they're "supposed" to believe about certain issues. Of course, when someone has a Ph.D. in a relevant field, I'd agree that you should be more inclined to assume they're not saying anything stupid about that field (though even that presumption is weakened if they're saying something that would be controversial among their peers).

As for "basic commitment to rationality," I'm not sure what you mean by that. I don't know how I'd turn it into a useful criterion, aside from defining it to mean people I'd trust for other reasons (e.g. endorsing standard attitudes of mainstream academia). It's quite easy for even creationists to declare their commitment to rationality. On the other hand, if you think someone's membership in the online rationalist community is a strong reason to treat what they say charitably, yeah, I'm calling that self-congratulatory nonsense.

And that's the essence of my reply to your point #5. It's not people having self-congratulatory attitudes on an individual level. It's the self-congratulatory attitudes towards their in-group.

Comment author: Solvent 03 March 2014 01:32:07AM 2 points [-]

I don't know for sure Alicorn and I would continue to disagree about the ethics of white lies if we talked it out thoroughly, but it wouldn't remotely surprise me.

That's a moral disagreement, not a factual disagreement. Alicorn is a deontologist, and you guys probably wouldn't be able to reach consensus on that no matter how hard you tried.

LINK: In favor of niceness, community, and civilisation

26 Solvent 24 February 2014 04:13AM

Scott, known on LessWrong as Yvain, recently wrote a post complaining about an inaccurate rape statistic.

Arthur Chu, notable for his recent Jeopardy! winning streak, argued against Scott's stance that we should be honest in arguments, in a comment thread on Jeff Kaufman's Facebook profile, which can be read here.

Scott just responded here, with a number of points relevant to the topic of rationalist communities.

I am interested in what LW thinks of this.

Obviously, at some point being polite in our arguments is silly. I'd be interested in people's opinions of how dire the real-world consequences have to be before it's worthwhile debating dishonestly.

Comment author: JonahSinick 17 February 2014 04:26:51AM 1 point [-]

My thinking was that you can choose to be one of the people who is sufficiently committed, though I acknowledge that this may not be realistic. I just added "If you're sufficiently committed, the expected benefits that you stand to gain from self-help CBT."

Comment author: Solvent 17 February 2014 08:23:45PM 1 point [-]

I interpreted that bit as "If you're the kind of person who is able to do this kind of thing, then self-administered CBT is a great idea."

Comment author: ChristianKl 11 February 2014 04:16:03PM 0 points [-]

Almost everyone who goes through the program gets a job, with an average salary above $90k.

What does almost mean in percentages?

How many people drop out of the program and how many complete it?

Comment author: Solvent 12 February 2014 10:44:35PM 2 points [-]

Of the people who graduated more than 6 months ago and looked for jobs (as opposed to going to university or something), all have jobs.

About 5% of people drop out of the program.

Comment author: RichardKennaway 31 January 2014 11:36:16AM *  0 points [-]

ETA: Note that I work for App Academy.

Any comment on this? (News article a couple of days ago on gummint regulators threatening to shut down App Academy and several similarly named organisations.)

Comment author: Solvent 01 February 2014 09:54:31PM *  0 points [-]

It will probably be fine. See here.
