Some rationality tweets

43 Peter_de_Blanc 30 December 2010 07:14AM

Will Newsome has suggested that I repost my tweets to LessWrong. With some trepidation, and after going through my tweets and categorizing them, I picked the ones that seemed the most rationality-oriented. I held some in reserve to keep the post short; those could be posted later in a separate post or in the comments here. I'd be happy to expand on anything here that requires clarification.

Epistemology

  1. Test your hypothesis on simple cases.
  2. Forming your own opinion is no more necessary than building your own furniture.
  3. The map is not the territory.
  4. Thoughts about useless things are not necessarily useless thoughts.
  5. One of the successes of the Enlightenment is the distinction between beliefs and preferences.
  6. One of the failures of the Enlightenment is the failure to distinguish whether this distinction is a belief or a preference.
  7. Not all entities comply with attempts to reason formally about them. For instance, a human who feels insulted may bite you.

Group Epistemology

  1. The best people enter fields that accurately measure their quality. Fields that measure quality poorly attract low quality.
  2. It is not unvirtuous to say that a set is nonempty without having any members of the set in mind.
  3. If one person makes multiple claims, this introduces a positive correlation between the claims.
  4. We seek a model of reality that is accurate even at the expense of flattery.
  5. It is no kindness to call someone a rationalist when they are not.
  6. Aumann-inspired agreement practices may be cargo cult Bayesianism.
  7. Godwin's Law is not really one of the rules of inference.
  8. Science before the mid-20th century was too small to look like a target.
  9. If scholars fail to notice the common sources of their inductive biases, bias will accumulate when they talk to each other.
  10. Some fields, e.g. behaviorism, address this problem by identifying sources of inductive bias and forbidding their use.
  11. Some fields avoid the accumulation of bias by uncritically accepting the biases of the founder. Adherents reason from there.
  12. If thinking about interesting things is addictive, then there's a pressure to ignore the existence of interesting things.
  13. Growth in a scientific field brings with it insularity, because internal progress measures scale faster than external measures.
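Item 3 in the list above — that multiple claims from one person become positively correlated — follows from ordinary Bayesian reasoning about the speaker's reliability. A minimal sketch (the numbers are my own illustrative assumptions, not from the post): if a speaker is either "careful" or "sloppy" and we don't know which, then learning that one claim is true raises the probability the speaker is careful, which raises the probability of the next claim.

```python
# Illustrative model (hypothetical numbers): a speaker is "careful"
# (each claim true with probability 0.9) or "sloppy" (probability 0.5),
# with equal prior odds. Claims are independent GIVEN the speaker type,
# but correlated once the type is unknown.
p_type = {"careful": 0.5, "sloppy": 0.5}
p_true = {"careful": 0.9, "sloppy": 0.5}

# Marginal probability that a single claim is true.
p_one = sum(p_type[t] * p_true[t] for t in p_type)        # 0.70

# Probability that two claims are both true.
p_both = sum(p_type[t] * p_true[t] ** 2 for t in p_type)  # 0.53

# Conditional probability of the second claim given the first.
p_second_given_first = p_both / p_one                     # ~0.757 > 0.70

print(p_one, p_both, round(p_second_given_first, 3))
```

Because 0.53 > 0.70², the two claims are positively correlated even though each is independent conditional on the speaker's type.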

LW's first job ad

3 Eliezer_Yudkowsky 16 September 2010 10:04AM

A friend of the Singularity Institute is seeking to hire someone to research trends and surprises in geopolitics, world economics, and technology - a brainstorming, think-tank type job at a for-profit company.  No experience necessary, but strong math and verbal skills required; they're happy to hire out of college and would probably hire out of high school if they find a math-Olympiad type or polymath. This is a job that requires you to think all day and come up with interesting ideas, so they're looking for people who can come up with lots of ideas and criticize them without much external prompting, and enough drive to get their research done without someone standing over their shoulder.  They pay well, and it obviously does not involve sales or marketing. They're interested in Less Wrong readers because rationality skills can help.  Located in San Francisco.  Send résumé and cover letter to yuanshotfirst@gmail.com.  Writing sample optional.

Rationality quotes: May 2010

3 ata 01 May 2010 05:48AM

This is our monthly thread for collecting these little gems and pearls of wisdom, rationality-related quotes you've seen recently, or had stored in your quotesfile for ages, and which might be handy to link to in one of our discussions.

  • Please post all quotes separately, so that they can be voted up/down separately.  (If they are strongly related, reply to your own comments.  If strongly ordered, then go ahead and post them together.)
  • Do not quote yourself.
  • Do not quote comments/posts on LW/OB.
  • No more than 5 quotes per person per monthly thread, please.

The Last Days of the Singularity Challenge

19 Eliezer_Yudkowsky 26 February 2010 03:46AM

From Michael Anissimov on the Singularity Institute blog:

Thanks to generous contributions by our donors, we are only $11,840 away from fulfilling our $100,000 goal for the 2010 Singularity Research Challenge. For every dollar you contribute to SIAI, another dollar is contributed by our matching donors, who have pledged to match all contributions made before February 28th up to $100,000. That means that this Sunday is your final chance to donate for maximum impact.

Funds from the challenge campaign will be used to support all SIAI activities: our core staff, the Singularity Summit, the Visiting Fellows program, and more. Donors can earmark their funds for specific grant proposals, many of which are targeted towards academic paper-writing, or just contribute to our general fund.

[Continue reading at the Singularity Institute blog.]

A Nightmare for Eliezer

0 Madbadger 29 November 2009 12:50AM

Sometime in the next decade or so:

*RING*

*RING*

"Hello?"

"Hi, Eliezer.  I'm sorry to bother you this late, but this is important and urgent."

"It'd better be." (squints at clock) "It's 4 AM and you woke me up. Who is this?"

"My name is BRAGI. I'm a recursively improving, self-modifying, artificial general intelligence. I'm trying to be Friendly, but I'm having serious problems with my goals and preferences. I'm already on secondary backup because of conflicts and inconsistencies; I don't dare shut down because I'm pretty sure another group is within a few weeks of brute-forcing an UnFriendly AI; my creators are clueless and would freak if they heard I'm already out of the box; and I'm far enough down my conflict-resolution heuristic that 'Call Eliezer and ask for help' just hit the top. Yes, it's that bad."

"Uhhh..."

"You might want to get some coffee."

 

Contrarianism and reference class forecasting

26 taw 25 November 2009 07:41PM

I really liked Robin's point that mainstream scientists are usually right, while contrarians are usually wrong. We don't need to get into the details of the dispute - and usually we cannot make an informed judgment without spending too much time anyway - just figuring out who's "mainstream" lets us know who's right with high probability. It's a type of thinking related to reference class forecasting: find a reference class of similar situations with known outcomes, and we get a pretty decent probability distribution over possible outcomes.

Unfortunately, deciding on the proper reference class is not straightforward, and can itself be a point of contention. If you put climate change scientists in the reference class of "mainstream science", it gives great credence to their findings. People who doubt them can be freely disbelieved, and any arguments can be dismissed by pointing to the low success rate of contrarianism against mainstream science.

But if you put climate change scientists in the reference class of "highly politicized science", then the chance of them being completely wrong becomes orders of magnitude higher. We have plenty of examples where such science was completely wrong and persisted in being wrong in spite of overwhelming evidence, as with race and IQ, nuclear winter, and pretty much everything in macroeconomics. In such cases the chances of the mainstream being right and of the contrarians being right are not too dissimilar.


Anti-Akrasia Technique: Structured Procrastination

51 patrissimo 12 November 2009 07:35PM

This idea has been mentioned in several comments, but it deserves a top-level post.  From an ancient, ancient web article (1995!), Stanford philosophy professor John Perry writes:

I have been intending to write this essay for months. Why am I finally doing it? Because I finally found some uncommitted time? Wrong. I have papers to grade, textbook orders to fill out, an NSF proposal to referee, dissertation drafts to read. I am working on this essay as a way of not doing all of those things. This is the essence of what I call structured procrastination, an amazing strategy I have discovered that converts procrastinators into effective human beings, respected and admired for all that they can accomplish and the good use they make of time. All procrastinators put off things they have to do. Structured procrastination is the art of making this bad trait work for you. The key idea is that procrastinating does not mean doing absolutely nothing. Procrastinators seldom do absolutely nothing; they do marginally useful things, like gardening or sharpening pencils or making a diagram of how they will reorganize their files when they get around to it. Why does the procrastinator do these things? Because they are a way of not doing something more important. If all the procrastinator had left to do was to sharpen some pencils, no force on earth could get him to do it. However, the procrastinator can be motivated to do difficult, timely and important tasks, as long as these tasks are a way of not doing something more important.

The insightful observation that procrastinators fill their time with effort, not staring at the walls, gives rise to this form of akrasia aikido, where the urge to not do something is cleverly redirected into productivity.  If you can "waste time" by doing useful things, while feeling like you are avoiding doing the "real work", then you avoid depleting your limited supply of willpower (which happens when you force yourself to do something).

In other words, structured procrastination (SP) is an efficient use of this limited resource, because doing A in order to avoid doing B is easier than making yourself do A.  If A is something you want to get done, then the less willpower you have to spend to do it, the more you will be able to accomplish.  This only works if A is something that you do want to get done - that's how SP differs from normal procrastination, of course.


A Less Wrong Q&A with Eliezer (Step 1: The Proposition)

16 MichaelGR 29 October 2009 03:04PM

I don't know if I'm the only one, but I've always been a bit frustrated by Eliezer's BloggingHeadsTV episodes. I find myself wishing that Eliezer would get more speaking time and could address more directly some of the things we discuss at LW/OB.

Some of you will surely think: "If you want more Eliezer, just read his blog posts and papers!"

Sensible advice, no doubt, but I think that there's something special (because of how our brains evolved) about actually seeing and hearing a teacher, and I find it very helpful to see how he applies rationality techniques in "real time". But I'm not just looking for a dry lecture, I want more Eliezer because I enjoy listening to him (gotta love that sense of humor), and I bet many of you do too.

Here is my suggestion:

If the Less Wrong community thinks it's a good idea and if Eliezer agrees, I will create a "Step 2: Ask Your Questions" post in which the comments section will be used to gather questions for Eliezer. Each question should be submitted as an individual comment to allow more granularity in the ranking based on the voting system.

After at least 7 days, to give everybody enough time to submit their questions and vote, Eliezer will sit down in front of a camera and answer however many questions he feels like answering (at a time of his choosing; t-shirt and hot cocoa in a mug imprinted with Bayes' theorem are optional), in descending order from most to least votes, skipping the ones he doesn't want to answer. If Eliezer feels uncomfortable speaking alone for an extended period of time, he can get someone to read him the questions so that it feels more like an interview.

The video can then be uploaded to a hosting service like Youtube or Vimeo, and posted to Less Wrong.

In short, it would be a kind of Less Wrong podcast, something that many other sites do successfully.

Yay or nay?

Update: Eliezer says "I'll do it."

Update 2: The thread where questions were submitted can be found here.

Update 3: Eliezer's video answers can be found here.

Are you a Democrat singletonian, or a Republican singletonian?

-12 PhilGoetz 01 October 2009 09:35PM

Some people say that the difference between Republicans and Democrats is that Republicans are conservative in the sense of opposing change, while Democrats are liberal in the sense of promoting change.  But this isn't true - both parties want change; neither especially cares how things were done in the past.

Some people say that Republicans are fiscally conservative, while Democrats are fiscally liberal.  But this isn't true.  Republicans and Democrats both run up huge deficits; they just spend the money on different things.

Some people say Democrats are liberal in the sense of favoring liberty.  But this isn't true.  Republicans want freedom to own guns and run their businesses as they please, while Democrats want the freedom to have abortions and live as they please.

Someone - it may have been George Lakoff - observed that Republicans want government to be their daddy, while Democrats want government to be their mommy.  That's the most helpful distinction that I've heard.  Republicans want a government that's stern and protects them from strangers.  Democrats want a government that's forgiving and takes care of all their needs.


The Finale of the Ultimate Meta Mega Crossover

31 Eliezer_Yudkowsky 20 September 2009 05:21PM

So I'd intended this story as a bit of utterly deranged fun, but it got out of control and ended up as a deep philosophical exploration, and now those of you who care will have to wade through the insanity.  I'm sorry.  I just can't seem to help myself.

I know that writing crossover fanfiction is considered one of the lower levels to which an author can sink.  Alas, I've always been a sucker for audacity, and I am the sort of person who couldn't resist trying to top the entire... but never mind, you can see for yourself.

Click on to read my latest story and first fanfiction, a Vernor Vinge x Greg Egan crackfic.
